Mar 21 04:23:14 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 21 04:23:14 crc restorecon[4683]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 21 04:23:14 crc restorecon[4683]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc 
restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc 
restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 
04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc 
restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:23:14 crc 
restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 
crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 21 04:23:14 crc restorecon[4683]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:23:14 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc 
restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 21 04:23:15 crc restorecon[4683]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 21 04:23:16 crc kubenswrapper[4839]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 04:23:16 crc kubenswrapper[4839]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 21 04:23:16 crc kubenswrapper[4839]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 04:23:16 crc kubenswrapper[4839]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 21 04:23:16 crc kubenswrapper[4839]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 21 04:23:16 crc kubenswrapper[4839]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.139700 4839 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143235 4839 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143258 4839 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143266 4839 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143274 4839 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143281 4839 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143301 4839 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143310 4839 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143317 4839 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143324 4839 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143333 4839 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143341 4839 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143349 4839 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143356 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143363 4839 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143369 4839 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143375 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143381 4839 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143387 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143396 4839 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143404 4839 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143410 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143417 4839 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143423 4839 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143429 4839 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143435 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143441 4839 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143447 4839 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143454 4839 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143460 4839 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143466 4839 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143474 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143480 4839 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143486 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143492 4839 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143499 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143505 4839 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143511 4839 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143517 4839 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143525 4839 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143534 4839 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143541 4839 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143548 4839 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143554 4839 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143560 4839 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143589 4839 feature_gate.go:330] unrecognized feature gate: Example
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143597 4839 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143603 4839 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143609 4839 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143615 4839 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143621 4839 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143629 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143635 4839 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143641 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143647 4839 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143653 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143660 4839 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143666 4839 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143673 4839 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143683 4839 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143691 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143699 4839 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143706 4839 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143712 4839 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143719 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143725 4839 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143732 4839 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143750 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143759 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143765 4839 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143771 4839 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.143780 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144541 4839 flags.go:64] FLAG: --address="0.0.0.0"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144567 4839 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144601 4839 flags.go:64] FLAG: --anonymous-auth="true"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144612 4839 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144621 4839 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144629 4839 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144639 4839 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144647 4839 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144655 4839 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144662 4839 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144670 4839 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144678 4839 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144685 4839 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144692 4839 flags.go:64] FLAG: --cgroup-root=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144699 4839 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144707 4839 flags.go:64] FLAG: --client-ca-file=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144714 4839 flags.go:64] FLAG: --cloud-config=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144721 4839 flags.go:64] FLAG: --cloud-provider=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144728 4839 flags.go:64] FLAG: --cluster-dns="[]"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144747 4839 flags.go:64] FLAG: --cluster-domain=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144754 4839 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144762 4839 flags.go:64] FLAG: --config-dir=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144769 4839 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144777 4839 flags.go:64] FLAG: --container-log-max-files="5"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144787 4839 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144794 4839 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144802 4839 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144809 4839 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144816 4839 flags.go:64] FLAG: --contention-profiling="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144823 4839 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144830 4839 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144849 4839 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144857 4839 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144883 4839 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144891 4839 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144898 4839 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144905 4839 flags.go:64] FLAG: --enable-load-reader="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144912 4839 flags.go:64] FLAG: --enable-server="true"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144919 4839 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144933 4839 flags.go:64] FLAG: --event-burst="100"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144941 4839 flags.go:64] FLAG: --event-qps="50"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144948 4839 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144955 4839 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144962 4839 flags.go:64] FLAG: --eviction-hard=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144971 4839 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144978 4839 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144985 4839 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144992 4839 flags.go:64] FLAG: --eviction-soft=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144999 4839 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145006 4839 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145013 4839 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145020 4839 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145028 4839 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145034 4839 flags.go:64] FLAG: --fail-swap-on="true"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145041 4839 flags.go:64] FLAG: --feature-gates=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145050 4839 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145057 4839 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145065 4839 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145072 4839 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145079 4839 flags.go:64] FLAG: --healthz-port="10248"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145087 4839 flags.go:64] FLAG: --help="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145094 4839 flags.go:64] FLAG: --hostname-override=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145101 4839 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145109 4839 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145116 4839 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145123 4839 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145130 4839 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145148 4839 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145157 4839 flags.go:64] FLAG: --image-service-endpoint=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145164 4839 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145171 4839 flags.go:64] FLAG: --kube-api-burst="100"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145178 4839 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145186 4839 flags.go:64] FLAG: --kube-api-qps="50"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145193 4839 flags.go:64] FLAG: --kube-reserved=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145200 4839 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145207 4839 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145214 4839 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145221 4839 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145228 4839 flags.go:64] FLAG: --lock-file=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145235 4839 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145243 4839 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145250 4839 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.145261 4839 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146045 4839 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146061 4839 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146069 4839 flags.go:64] FLAG: --logging-format="text"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146076 4839 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146084 4839 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146092 4839 flags.go:64] FLAG: --manifest-url=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146099 4839 flags.go:64] FLAG: --manifest-url-header=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146109 4839 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146117 4839 flags.go:64] FLAG: --max-open-files="1000000"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146126 4839 flags.go:64] FLAG: --max-pods="110"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146133 4839 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146141 4839 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146149 4839 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146156 4839 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146164 4839 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146172 4839 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146182 4839 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146201 4839 flags.go:64] FLAG: --node-status-max-images="50"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146209 4839 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146216 4839 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146307 4839 flags.go:64] FLAG: --pod-cidr=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146319 4839 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146337 4839 flags.go:64] FLAG: --pod-manifest-path=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146344 4839 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146352 4839 flags.go:64] FLAG: --pods-per-core="0"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146359 4839 flags.go:64] FLAG: --port="10250"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146367 4839 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146374 4839 flags.go:64] FLAG: --provider-id=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146381 4839 flags.go:64] FLAG: --qos-reserved=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146388 4839 flags.go:64] FLAG: --read-only-port="10255"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146396 4839 flags.go:64] FLAG: --register-node="true"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146403 4839 flags.go:64] FLAG: --register-schedulable="true"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146410 4839 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146423 4839 flags.go:64] FLAG: --registry-burst="10"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146431 4839 flags.go:64] FLAG: --registry-qps="5"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146437 4839 flags.go:64] FLAG: --reserved-cpus=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146444 4839 flags.go:64] FLAG: --reserved-memory=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146453 4839 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146461 4839 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146468 4839 flags.go:64] FLAG: --rotate-certificates="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146475 4839 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146482 4839 flags.go:64] FLAG: --runonce="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146489 4839 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146497 4839 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146505 4839 flags.go:64] FLAG: --seccomp-default="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146517 4839 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146524 4839 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146532 4839 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146539 4839 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146547 4839 flags.go:64] FLAG: --storage-driver-password="root"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146554 4839 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146561 4839 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146589 4839 flags.go:64] FLAG: --storage-driver-user="root"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146597 4839 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146604 4839 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146612 4839 flags.go:64] FLAG: --system-cgroups=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146631 4839 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146644 4839 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146651 4839 flags.go:64] FLAG: --tls-cert-file=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146658 4839 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146673 4839 flags.go:64] FLAG: --tls-min-version=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146680 4839 flags.go:64] FLAG: --tls-private-key-file=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146687 4839 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146694 4839 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146702 4839 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146709 4839 flags.go:64] FLAG: --v="2"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146718 4839 flags.go:64] FLAG: --version="false"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146727 4839 flags.go:64] FLAG: --vmodule=""
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146736 4839 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146744 4839 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.146965 4839 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.146976 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.146983 4839 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.146990 4839 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.146996 4839 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147002 4839 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147009 4839 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147016 4839 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147022 4839 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147028 4839 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147035 4839 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147041 4839 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147047 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147053 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147059 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147065 4839 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147071 4839 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147077 4839 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147083 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147090 4839 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147096 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147102 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147110 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147116 4839 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147122 4839 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147129 4839 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147135 4839 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147142 4839 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147148 4839 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147154 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147161 4839 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147167 4839 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147174 4839 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147180 4839 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147186 4839 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147193 4839 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147202 4839 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147210 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147218 4839 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
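The `flags.go:64] FLAG:` records earlier in this section dump every kubelet flag with its effective value. They can be collected into a dictionary with a short script; a sketch, assuming the `FLAG: --name="value"` shape seen in this log:

```python
import re

# Matches the klog FLAG records above, e.g.
#   ... flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
FLAG_RE = re.compile(r'flags\.go:64\] FLAG: (--[\w-]+)="(.*?)"$')

def parse_flag_lines(lines):
    """Collect kubelet FLAG records into a {flag: value} dict."""
    flags = {}
    for line in lines:
        m = FLAG_RE.search(line)
        if m:
            flags[m.group(1)] = m.group(2)
    return flags

sample = [
    'Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.144678 4839 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"',
    'Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.146709 4839 flags.go:64] FLAG: --v="2"',
]
print(parse_flag_lines(sample))  # → {'--cert-dir': '/var/lib/kubelet/pki', '--v': '2'}
```

Piping `journalctl -u kubelet` output through such a parser makes it easy to diff the flag dump between restarts.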
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147228 4839 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147236 4839 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147243 4839 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147250 4839 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147258 4839 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147267 4839 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147274 4839 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147281 4839 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147287 4839 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147293 4839 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147299 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147306 4839 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147313 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147320 4839 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147326 4839 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147332 4839 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147339 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147345 4839 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147352 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147369 4839 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147376 4839 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147383 4839 feature_gate.go:330] unrecognized feature gate: Example
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147390 4839 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147398 4839 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147406 4839 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147413 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147421 4839 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147428 4839 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147434 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147441 4839 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147447 4839 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.147453 4839 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.148855 4839 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.164172 4839 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.164225 4839 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164391 4839 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164418 4839 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164429 4839 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164438 4839 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164446 4839 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164455 4839 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164464 4839 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164472 4839 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164480 4839 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164487 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164495 4839 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164503 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164511 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164520 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164527 4839 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164535 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164543 4839 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164550 4839 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164558 4839 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164596 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164604 4839 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164613 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164621 4839 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164629 4839 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164637 4839 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164645 4839 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164652 4839 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164660 4839 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164668 4839 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164677 4839 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164685 4839 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164693 4839 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164701 4839 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164711 4839 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164720 4839 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164729 4839 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164737 4839 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164745 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164755 4839 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164763 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164771 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164778 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164786 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164794 4839 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164805 4839 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164815 4839 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164824 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164833 4839 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164841 4839 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164849 4839 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164857 4839 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164865 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164873 4839 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164880 4839 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164888 4839 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164896 4839 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164904 4839 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164911 4839 feature_gate.go:330] unrecognized feature gate: Example
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164920 4839 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164928 4839 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164936 4839 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164944 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164952 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164959 4839 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164967 4839 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164975 4839 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164983 4839 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164991 4839 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.164998 4839 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165008 4839 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165019 4839 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.165032 4839 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165268 4839 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165281 4839 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165291 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165301 4839 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165310 4839 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165319 4839 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165327 4839 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165336 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165344 4839 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165352 4839 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165359 4839 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165367 4839 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165375 4839 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165384 4839 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165392 4839 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165399 4839 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165408 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165415 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165423 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165431 4839 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165439 4839 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165447 4839 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165456 4839 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165465 4839 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165473 4839 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165481 4839 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165488 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165496 4839 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165504 4839 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165512 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165520 4839 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165528 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165536 4839 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165544 4839 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165552 4839 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165560 4839 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165594 4839 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165603 4839 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165611 4839 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165620 4839 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165628 4839 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165635 4839 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165643 4839 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165653 4839 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165663 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165671 4839 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165679 4839 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165688 4839 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165697 4839 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165706 4839 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165714 4839 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165721 4839 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165729 4839 feature_gate.go:330] unrecognized feature gate: Example
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165737 4839 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165744 4839 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165755 4839 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165764 4839 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165772 4839 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165781 4839 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165790 4839 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165799 4839 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165806 4839 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165814 4839 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165822 4839 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165830 4839 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165838 4839 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165845 4839 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165853 4839 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165863 4839 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165873 4839 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.165881 4839 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.165893 4839 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.167231 4839 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.175880 4839 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.184024 4839 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.184179 4839 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.186358 4839 server.go:997] "Starting client certificate rotation"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.186411 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.186697 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.213367 4839 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.216943 4839 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.217523 4839 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.234369 4839 log.go:25] "Validated CRI v1 runtime API"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.306657 4839 log.go:25] "Validated CRI v1 image API"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.308627 4839 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.319681 4839 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-21-04-18-13-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.319727 4839 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.342904 4839 manager.go:217] Machine: {Timestamp:2026-03-21 04:23:16.332186195 +0000 UTC m=+0.659972891 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2a7bfad9-30ba-42d8-b982-971191ebb9d6 BootID:d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:2d:de:1b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:2d:de:1b Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:50:03:86 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:94:21:78 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:14:0c:d2 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:3c:5a:55 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:f6:02:ef:bb:a6:9f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:1e:2f:50:22:59:e2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.343231 4839 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.343544 4839 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.346140 4839 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.346325 4839 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.346356 4839 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.346554 4839 topology_manager.go:138] "Creating topology manager with none policy"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.346563 4839 container_manager_linux.go:303] "Creating device plugin manager"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.346935 4839 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.346959 4839 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.347118 4839 state_mem.go:36] "Initialized new in-memory state store"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.347212 4839 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.365109 4839 kubelet.go:418] "Attempting to sync node with API server"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.365155 4839 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.365181 4839 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.365195 4839 kubelet.go:324] "Adding apiserver pod source"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.365207 4839 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.371123 4839 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.371661 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused
Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.371820 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.371758 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused
Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.371909 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError"
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.372421 4839 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.377815 4839 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384456 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384514 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384528 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384540 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384559 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384593 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384603 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384619 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384654 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384665 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384696 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.384705 4839 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.385967 4839 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.386614 4839 server.go:1280] "Started kubelet" Mar 21 04:23:16 crc systemd[1]: Started Kubernetes Kubelet. Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.389542 4839 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.389547 4839 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.393149 4839 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.393258 4839 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.393318 4839 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.393223 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.394137 4839 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.394171 4839 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.394178 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.394274 4839 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.395412 4839 server.go:460] "Adding debug handlers to kubelet server" Mar 21 04:23:16 crc 
kubenswrapper[4839]: W0321 04:23:16.395707 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.395807 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.395823 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="200ms" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.403256 4839 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.403297 4839 factory.go:55] Registering systemd factory Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.403307 4839 factory.go:221] Registration of the systemd container factory successfully Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.403701 4839 factory.go:153] Registering CRI-O factory Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.403745 4839 factory.go:221] Registration of the crio container factory successfully Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.403790 4839 factory.go:103] Registering Raw factory Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 
04:23:16.403813 4839 manager.go:1196] Started watching for new ooms in manager Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.404813 4839 manager.go:319] Starting recovery of all containers Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.404640 4839 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ec088a75d2ad4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,LastTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408324 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408401 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408417 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408431 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408444 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408456 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408468 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408481 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408497 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408509 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408521 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408535 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408547 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408598 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408610 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408622 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408633 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408668 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408682 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408695 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408707 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408719 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408732 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408745 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408758 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408770 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408812 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" 
seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408830 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408843 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408855 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408867 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408881 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408895 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408908 
4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408920 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408961 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408975 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.408988 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409026 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409039 4839 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409052 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409064 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409076 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409091 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409103 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409114 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409126 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409139 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409156 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409170 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409184 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409196 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409214 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409228 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409241 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409254 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409267 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409279 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" 
seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409291 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409302 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409313 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409324 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409338 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409349 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409361 4839 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409374 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409392 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409405 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409417 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409429 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409442 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409453 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409466 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409477 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409491 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409507 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409519 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409531 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409543 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409556 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409588 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409600 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409615 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" 
seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409650 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409664 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409678 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409691 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409704 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409716 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 
04:23:16.409729 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409741 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409754 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409766 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409779 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409791 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409803 4839 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409816 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409829 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409841 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409855 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409873 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409888 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409902 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409916 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409935 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409950 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409966 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409980 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.409994 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410008 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410022 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410035 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410051 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410063 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410079 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410093 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410106 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410118 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410131 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410143 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410156 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410169 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410182 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410193 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410206 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410218 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410230 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410267 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410279 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410291 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410305 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410317 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410345 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410359 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410379 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410392 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410405 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410416 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410429 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410441 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410454 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410466 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410478 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410490 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" 
seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410502 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410514 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410526 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410539 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410592 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410605 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410618 4839 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410633 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410647 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410723 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410747 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410782 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410813 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410826 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410890 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410906 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410918 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410949 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410966 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.410995 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.411030 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.411043 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.411081 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.411095 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.411110 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415504 4839 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415559 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415603 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415619 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415633 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415650 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415666 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415680 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415695 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415708 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415721 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415734 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415747 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415761 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415776 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415790 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415805 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415835 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" 
seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415852 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415869 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415886 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415899 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415913 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415926 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: 
I0321 04:23:16.415939 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415952 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415965 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415980 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.415992 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416007 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416020 4839 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416033 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416046 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416059 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416133 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416147 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416160 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416173 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416186 4839 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416200 4839 reconstruct.go:97] "Volume reconstruction finished" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.416210 4839 reconciler.go:26] "Reconciler: start to sync state" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.428675 4839 manager.go:324] Recovery completed Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.436907 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.438892 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.438928 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.438936 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.439676 4839 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.439705 4839 cpu_manager.go:226] 
"Reconciling" reconcilePeriod="10s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.439733 4839 state_mem.go:36] "Initialized new in-memory state store" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.448431 4839 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.451432 4839 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.451502 4839 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.451539 4839 kubelet.go:2335] "Starting kubelet main sync loop" Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.451626 4839 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.452462 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.452554 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.454200 4839 policy_none.go:49] "None policy: Start" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.455116 4839 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.455162 4839 state_mem.go:35] 
"Initializing new in-memory state store" Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.494499 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.510054 4839 manager.go:334] "Starting Device Plugin manager" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.510210 4839 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.510248 4839 server.go:79] "Starting device plugin registration server" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.510747 4839 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.510769 4839 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.511395 4839 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.511513 4839 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.511527 4839 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.519415 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.551760 4839 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:23:16 crc kubenswrapper[4839]: 
I0321 04:23:16.551874 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.553485 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.553527 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.553539 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.553689 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.553876 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.553931 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554622 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554651 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554664 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554713 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554731 4839 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554739 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554761 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554926 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.554968 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.555748 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.555783 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.555802 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.555961 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.556527 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.556597 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.556957 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557012 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557024 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557226 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557273 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557286 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557443 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557448 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557578 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557598 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557613 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.557613 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558445 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558482 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558493 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558771 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558817 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558889 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558923 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.558932 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.560766 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.560830 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.560843 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.596446 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="400ms" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.611908 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617139 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 
04:23:16.617196 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617211 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617245 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617611 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617663 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617688 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617710 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 
04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617731 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617837 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.617861 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617896 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.617930 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.618050 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.618129 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.618183 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.618245 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.618289 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.618329 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.618370 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719518 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719583 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719602 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719621 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719641 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719660 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719682 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719700 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719722 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719736 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719753 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719772 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719769 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719805 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719790 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719808 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719851 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719833 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719895 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719868 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719903 4839 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719921 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719993 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.719999 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.720002 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.720022 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.720049 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.720032 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.720029 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.720609 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.818987 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.820205 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.820239 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:16 crc 
kubenswrapper[4839]: I0321 04:23:16.820250 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.820275 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.820695 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.894726 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.917853 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.932610 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.950615 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: I0321 04:23:16.957034 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.965732 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-03957bb4622e5004f26f153371c6fddb77a147307f82bf8f87fd59c216a9ddbe WatchSource:0}: Error finding container 03957bb4622e5004f26f153371c6fddb77a147307f82bf8f87fd59c216a9ddbe: Status 404 returned error can't find the container with id 03957bb4622e5004f26f153371c6fddb77a147307f82bf8f87fd59c216a9ddbe Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.969047 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-0b610b8937ee54703f32eda1bf359d6f6b8bbb248c8fe9fdd4d8714f22767243 WatchSource:0}: Error finding container 0b610b8937ee54703f32eda1bf359d6f6b8bbb248c8fe9fdd4d8714f22767243: Status 404 returned error can't find the container with id 0b610b8937ee54703f32eda1bf359d6f6b8bbb248c8fe9fdd4d8714f22767243 Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.975074 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-014b541bf5b1bbe0d5a319c761c98dc491c7e5269fdcf13cd63afd8c06738cbc WatchSource:0}: Error finding container 014b541bf5b1bbe0d5a319c761c98dc491c7e5269fdcf13cd63afd8c06738cbc: Status 404 returned error can't find the container with id 014b541bf5b1bbe0d5a319c761c98dc491c7e5269fdcf13cd63afd8c06738cbc Mar 21 04:23:16 crc kubenswrapper[4839]: W0321 04:23:16.989588 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-754d299066d0ee97673eb4ca055e2e5aa667c8d5f8c83cff3cf369b224706549 
WatchSource:0}: Error finding container 754d299066d0ee97673eb4ca055e2e5aa667c8d5f8c83cff3cf369b224706549: Status 404 returned error can't find the container with id 754d299066d0ee97673eb4ca055e2e5aa667c8d5f8c83cff3cf369b224706549 Mar 21 04:23:16 crc kubenswrapper[4839]: E0321 04:23:16.997905 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="800ms" Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.221079 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.223488 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.223535 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.223546 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.223589 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:23:17 crc kubenswrapper[4839]: E0321 04:23:17.224053 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Mar 21 04:23:17 crc kubenswrapper[4839]: W0321 04:23:17.376908 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:17 crc 
kubenswrapper[4839]: E0321 04:23:17.377007 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.394893 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.456340 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"014b541bf5b1bbe0d5a319c761c98dc491c7e5269fdcf13cd63afd8c06738cbc"} Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.462471 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0b610b8937ee54703f32eda1bf359d6f6b8bbb248c8fe9fdd4d8714f22767243"} Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.463475 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"03957bb4622e5004f26f153371c6fddb77a147307f82bf8f87fd59c216a9ddbe"} Mar 21 04:23:17 crc kubenswrapper[4839]: I0321 04:23:17.464373 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a8e721ca03ad45ff241630168dd0d108390701f4a4d5343f2606e1ee00a9be73"} Mar 21 04:23:17 crc 
kubenswrapper[4839]: I0321 04:23:17.465298 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"754d299066d0ee97673eb4ca055e2e5aa667c8d5f8c83cff3cf369b224706549"} Mar 21 04:23:17 crc kubenswrapper[4839]: W0321 04:23:17.675951 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:17 crc kubenswrapper[4839]: E0321 04:23:17.676025 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:23:17 crc kubenswrapper[4839]: W0321 04:23:17.709336 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:17 crc kubenswrapper[4839]: E0321 04:23:17.709428 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:23:17 crc kubenswrapper[4839]: E0321 04:23:17.799464 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="1.6s" Mar 21 04:23:17 crc kubenswrapper[4839]: W0321 04:23:17.841013 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:17 crc kubenswrapper[4839]: E0321 04:23:17.841104 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.024674 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.026169 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.026231 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.026244 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.026275 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:23:18 crc kubenswrapper[4839]: E0321 04:23:18.026905 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" 
node="crc" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.321344 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 04:23:18 crc kubenswrapper[4839]: E0321 04:23:18.322975 4839 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.395132 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.469991 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e"} Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.470067 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1e036a62b31123ade40717ddff4b8b13971bdcda78062ebf348d49978c3c7a58"} Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.470086 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c"} Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.472015 
4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649" exitCode=0 Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.472088 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649"} Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.472224 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.473591 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.473820 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.473858 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.475492 4839 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf" exitCode=0 Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.475546 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf"} Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.475656 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.476949 4839 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.476997 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.477014 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.477648 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.478701 4839 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68" exitCode=0 Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.478831 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.478873 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.478899 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.478914 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.478909 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68"} Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.481011 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.481191 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.481205 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.482764 4839 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441" exitCode=0 Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.482798 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441"} Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.482876 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.483801 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.483833 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:18 crc kubenswrapper[4839]: I0321 04:23:18.483846 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.395417 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:19 crc kubenswrapper[4839]: E0321 04:23:19.401430 
4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="3.2s" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.489726 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.490043 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.490137 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.490170 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.491152 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.491211 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.491228 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.493982 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.494042 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.494904 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.494925 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.494935 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.499130 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.499190 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.499203 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.499213 4839 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.501188 4839 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5" exitCode=0 Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.501261 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.501399 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.502719 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.502748 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.502759 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.507252 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5c5299f0598312d0ef997d7c51fad5c0b882bd65e5964794ac66179575373fcd"} Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.507386 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.508147 
4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.508174 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.508184 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.627350 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.628624 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.628668 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.628677 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.628705 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:23:19 crc kubenswrapper[4839]: E0321 04:23:19.629120 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.820381 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:19 crc kubenswrapper[4839]: I0321 04:23:19.836209 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:19 crc kubenswrapper[4839]: W0321 
04:23:19.840454 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:23:19 crc kubenswrapper[4839]: E0321 04:23:19.840538 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.513238 4839 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c" exitCode=0 Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.513347 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.513330 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c"} Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.514114 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.514135 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.514146 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.517466 4839 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"af0e08f67c4187e0c2d779bf55ac4af88b5145beda32bcfbd1ce7b4738e7d889"} Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.517561 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.517616 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.517661 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.517721 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.518122 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.518382 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.518405 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.518416 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519422 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519444 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 
04:23:20.519453 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519489 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519503 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519511 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519443 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519588 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:20 crc kubenswrapper[4839]: I0321 04:23:20.519599 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.361378 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524724 4839 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524782 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484"} Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524817 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 
04:23:21.524864 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524897 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709"} Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524769 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524923 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83"} Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524949 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd"} Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.524761 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.526426 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.526513 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.526632 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.526776 4839 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.526817 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.526830 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.527582 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.527621 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.527631 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:21 crc kubenswrapper[4839]: I0321 04:23:21.881950 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.518791 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.536025 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd"} Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.536092 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.536207 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.536297 4839 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.537882 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.537939 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.537963 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.538088 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.538131 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.538128 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.538161 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.538192 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.538414 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.829746 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.832215 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 
04:23:22.832273 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.832287 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.832324 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:23:22 crc kubenswrapper[4839]: I0321 04:23:22.841853 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.539642 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.541832 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.541918 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.541940 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.923199 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.923470 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.925071 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.925135 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 
04:23:23 crc kubenswrapper[4839]: I0321 04:23:23.925158 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.250032 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.543399 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.543465 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.545263 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.545299 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.545310 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.545380 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.545419 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:24 crc kubenswrapper[4839]: I0321 04:23:24.545441 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:25 crc kubenswrapper[4839]: I0321 04:23:25.450423 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:25 crc kubenswrapper[4839]: I0321 04:23:25.450721 4839 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:25 crc kubenswrapper[4839]: I0321 04:23:25.452412 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:25 crc kubenswrapper[4839]: I0321 04:23:25.452462 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:25 crc kubenswrapper[4839]: I0321 04:23:25.452472 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:26 crc kubenswrapper[4839]: E0321 04:23:26.519600 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:23:27 crc kubenswrapper[4839]: I0321 04:23:27.251267 4839 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:23:27 crc kubenswrapper[4839]: I0321 04:23:27.251367 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 04:23:27 crc kubenswrapper[4839]: I0321 04:23:27.472487 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 21 04:23:27 crc kubenswrapper[4839]: I0321 04:23:27.472762 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:27 crc kubenswrapper[4839]: I0321 
04:23:27.474091 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:27 crc kubenswrapper[4839]: I0321 04:23:27.474132 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:27 crc kubenswrapper[4839]: I0321 04:23:27.474147 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:30 crc kubenswrapper[4839]: W0321 04:23:30.254945 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.255080 4839 trace.go:236] Trace[1885837127]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Mar-2026 04:23:20.253) (total time: 10001ms): Mar 21 04:23:30 crc kubenswrapper[4839]: Trace[1885837127]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (04:23:30.254) Mar 21 04:23:30 crc kubenswrapper[4839]: Trace[1885837127]: [10.001844256s] [10.001844256s] END Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.255116 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 21 04:23:30 crc kubenswrapper[4839]: W0321 04:23:30.270715 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS 
handshake timeout Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.270822 4839 trace.go:236] Trace[2096808232]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Mar-2026 04:23:20.269) (total time: 10001ms): Mar 21 04:23:30 crc kubenswrapper[4839]: Trace[2096808232]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (04:23:30.270) Mar 21 04:23:30 crc kubenswrapper[4839]: Trace[2096808232]: [10.001268971s] [10.001268971s] END Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.270845 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.394995 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 21 04:23:30 crc kubenswrapper[4839]: W0321 04:23:30.693793 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.693948 4839 trace.go:236] Trace[925494500]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Mar-2026 04:23:20.692) (total time: 10001ms): Mar 21 04:23:30 crc kubenswrapper[4839]: Trace[925494500]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake 
timeout 10001ms (04:23:30.693) Mar 21 04:23:30 crc kubenswrapper[4839]: Trace[925494500]: [10.001568458s] [10.001568458s] END Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.693990 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.887954 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:30Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.889021 4839 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:30Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ec088a75d2ad4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,LastTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:23:30 crc kubenswrapper[4839]: W0321 04:23:30.889251 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: 
Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:30Z is after 2026-02-23T05:33:13Z Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.889327 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.889649 4839 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:30 crc kubenswrapper[4839]: E0321 04:23:30.890558 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:30Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.899660 4839 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.899722 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.904502 4839 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 21 04:23:30 crc kubenswrapper[4839]: I0321 04:23:30.904591 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.368405 4839 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]log ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]etcd ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 21 04:23:31 crc 
kubenswrapper[4839]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/generic-apiserver-start-informers ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/priority-and-fairness-filter ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-apiextensions-informers ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-apiextensions-controllers ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/crd-informer-synced ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-system-namespaces-controller ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 21 04:23:31 crc kubenswrapper[4839]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 21 04:23:31 crc kubenswrapper[4839]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/bootstrap-controller ok Mar 21 04:23:31 crc kubenswrapper[4839]: 
[+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/start-kube-aggregator-informers ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/apiservice-registration-controller ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/apiservice-discovery-controller ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]autoregister-completion ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/apiservice-openapi-controller ok Mar 21 04:23:31 crc kubenswrapper[4839]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 21 04:23:31 crc kubenswrapper[4839]: livez check failed Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.368489 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.396462 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:31Z is after 2026-02-23T05:33:13Z Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.566726 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.568959 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="af0e08f67c4187e0c2d779bf55ac4af88b5145beda32bcfbd1ce7b4738e7d889" exitCode=255 Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.569010 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"af0e08f67c4187e0c2d779bf55ac4af88b5145beda32bcfbd1ce7b4738e7d889"} Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.569189 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.570323 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.570381 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.570400 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.571274 4839 scope.go:117] "RemoveContainer" containerID="af0e08f67c4187e0c2d779bf55ac4af88b5145beda32bcfbd1ce7b4738e7d889" Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.888986 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.889168 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.890455 4839 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.890558 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:31 crc kubenswrapper[4839]: I0321 04:23:31.890641 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.398218 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:32Z is after 2026-02-23T05:33:13Z Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.575482 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.577696 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843"} Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.577910 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.579214 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.579272 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.579285 4839 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.873839 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.874014 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.875113 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.875186 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.875214 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:32 crc kubenswrapper[4839]: I0321 04:23:32.891761 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.399475 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:33Z is after 2026-02-23T05:33:13Z Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.583760 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.584520 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" 
Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.587233 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843" exitCode=255 Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.587337 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843"} Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.587393 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.587446 4839 scope.go:117] "RemoveContainer" containerID="af0e08f67c4187e0c2d779bf55ac4af88b5145beda32bcfbd1ce7b4738e7d889" Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.587597 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.590123 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.590179 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.590193 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.591257 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.591320 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.591335 
4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.592324 4839 scope.go:117] "RemoveContainer" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843" Mar 21 04:23:33 crc kubenswrapper[4839]: E0321 04:23:33.592490 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:23:33 crc kubenswrapper[4839]: I0321 04:23:33.679367 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:33 crc kubenswrapper[4839]: W0321 04:23:33.961108 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:33Z is after 2026-02-23T05:33:13Z Mar 21 04:23:33 crc kubenswrapper[4839]: E0321 04:23:33.961245 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:34 crc kubenswrapper[4839]: W0321 04:23:34.372335 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:34Z is after 2026-02-23T05:33:13Z Mar 21 04:23:34 crc kubenswrapper[4839]: E0321 04:23:34.372413 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:34 crc kubenswrapper[4839]: I0321 04:23:34.397400 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:34Z is after 2026-02-23T05:33:13Z Mar 21 04:23:34 crc kubenswrapper[4839]: I0321 04:23:34.593176 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 04:23:34 crc kubenswrapper[4839]: I0321 04:23:34.595894 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:34 crc kubenswrapper[4839]: I0321 04:23:34.596844 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:34 crc kubenswrapper[4839]: I0321 04:23:34.596894 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:34 crc kubenswrapper[4839]: I0321 
04:23:34.596905 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:34 crc kubenswrapper[4839]: I0321 04:23:34.597332 4839 scope.go:117] "RemoveContainer" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843" Mar 21 04:23:34 crc kubenswrapper[4839]: E0321 04:23:34.597496 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:23:35 crc kubenswrapper[4839]: I0321 04:23:35.399479 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:35Z is after 2026-02-23T05:33:13Z Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.373165 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.373426 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.375300 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.375373 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.375396 4839 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.379792 4839 scope.go:117] "RemoveContainer" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843" Mar 21 04:23:36 crc kubenswrapper[4839]: E0321 04:23:36.380439 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.384927 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.399093 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:36Z is after 2026-02-23T05:33:13Z Mar 21 04:23:36 crc kubenswrapper[4839]: E0321 04:23:36.519766 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.601239 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.602834 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.603406 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:36 crc 
kubenswrapper[4839]: I0321 04:23:36.603746 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:36 crc kubenswrapper[4839]: I0321 04:23:36.605167 4839 scope.go:117] "RemoveContainer" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843" Mar 21 04:23:36 crc kubenswrapper[4839]: E0321 04:23:36.605839 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:23:36 crc kubenswrapper[4839]: W0321 04:23:36.659767 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:36Z is after 2026-02-23T05:33:13Z Mar 21 04:23:36 crc kubenswrapper[4839]: E0321 04:23:36.659868 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.251780 4839 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.251889 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.288120 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.289751 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.289833 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.289854 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.289906 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:23:37 crc kubenswrapper[4839]: E0321 04:23:37.293157 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:37Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:23:37 crc kubenswrapper[4839]: E0321 04:23:37.295647 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:37Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.397844 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:37Z is after 2026-02-23T05:33:13Z Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.770631 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.770894 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.772965 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.773010 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.773022 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:37 crc kubenswrapper[4839]: I0321 04:23:37.773799 4839 scope.go:117] "RemoveContainer" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843" Mar 21 04:23:37 crc kubenswrapper[4839]: E0321 04:23:37.774027 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:23:38 crc kubenswrapper[4839]: I0321 04:23:38.399888 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:38Z is after 2026-02-23T05:33:13Z Mar 21 04:23:39 crc kubenswrapper[4839]: I0321 04:23:39.073852 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 04:23:39 crc kubenswrapper[4839]: E0321 04:23:39.077489 4839 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:39Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:39 crc kubenswrapper[4839]: I0321 04:23:39.398673 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:39Z is after 2026-02-23T05:33:13Z Mar 21 04:23:40 crc kubenswrapper[4839]: I0321 04:23:40.397560 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:40Z is after 2026-02-23T05:33:13Z Mar 21 04:23:40 crc kubenswrapper[4839]: E0321 04:23:40.895189 4839 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:40Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ec088a75d2ad4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,LastTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:23:41 crc kubenswrapper[4839]: W0321 04:23:41.373355 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:41Z is after 2026-02-23T05:33:13Z Mar 21 04:23:41 crc kubenswrapper[4839]: E0321 04:23:41.373440 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" 
Mar 21 04:23:41 crc kubenswrapper[4839]: I0321 04:23:41.398009 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:41Z is after 2026-02-23T05:33:13Z Mar 21 04:23:42 crc kubenswrapper[4839]: I0321 04:23:42.398778 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:42Z is after 2026-02-23T05:33:13Z Mar 21 04:23:43 crc kubenswrapper[4839]: W0321 04:23:43.046506 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:43Z is after 2026-02-23T05:33:13Z Mar 21 04:23:43 crc kubenswrapper[4839]: E0321 04:23:43.046659 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:43 crc kubenswrapper[4839]: I0321 04:23:43.399191 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-21T04:23:43Z is after 2026-02-23T05:33:13Z Mar 21 04:23:43 crc kubenswrapper[4839]: W0321 04:23:43.916269 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:43Z is after 2026-02-23T05:33:13Z Mar 21 04:23:43 crc kubenswrapper[4839]: E0321 04:23:43.916398 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:44 crc kubenswrapper[4839]: I0321 04:23:44.294079 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:44 crc kubenswrapper[4839]: I0321 04:23:44.295687 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:44 crc kubenswrapper[4839]: I0321 04:23:44.295729 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:44 crc kubenswrapper[4839]: I0321 04:23:44.295741 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:44 crc kubenswrapper[4839]: I0321 04:23:44.295768 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:23:44 crc kubenswrapper[4839]: E0321 04:23:44.301135 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:44Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 21 04:23:44 crc kubenswrapper[4839]: E0321 04:23:44.301556 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:44Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:23:44 crc kubenswrapper[4839]: I0321 04:23:44.399858 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:44Z is after 2026-02-23T05:33:13Z Mar 21 04:23:45 crc kubenswrapper[4839]: I0321 04:23:45.398703 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:45Z is after 2026-02-23T05:33:13Z Mar 21 04:23:46 crc kubenswrapper[4839]: I0321 04:23:46.399295 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:46Z is after 2026-02-23T05:33:13Z Mar 21 04:23:46 crc kubenswrapper[4839]: E0321 04:23:46.519846 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node 
info: node \"crc\" not found" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.251449 4839 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.251852 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.252000 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.252226 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.253652 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.253713 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.253734 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.254483 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" 
containerStatusID={"Type":"cri-o","ID":"1e036a62b31123ade40717ddff4b8b13971bdcda78062ebf348d49978c3c7a58"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.254762 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://1e036a62b31123ade40717ddff4b8b13971bdcda78062ebf348d49978c3c7a58" gracePeriod=30 Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.397695 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:47Z is after 2026-02-23T05:33:13Z Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.635087 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.635693 4839 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1e036a62b31123ade40717ddff4b8b13971bdcda78062ebf348d49978c3c7a58" exitCode=255 Mar 21 04:23:47 crc kubenswrapper[4839]: I0321 04:23:47.635769 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1e036a62b31123ade40717ddff4b8b13971bdcda78062ebf348d49978c3c7a58"} Mar 21 04:23:48 crc kubenswrapper[4839]: W0321 04:23:48.313208 4839 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:48Z is after 2026-02-23T05:33:13Z Mar 21 04:23:48 crc kubenswrapper[4839]: E0321 04:23:48.313319 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:48Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:48 crc kubenswrapper[4839]: I0321 04:23:48.397037 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:48Z is after 2026-02-23T05:33:13Z Mar 21 04:23:48 crc kubenswrapper[4839]: I0321 04:23:48.641734 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 04:23:48 crc kubenswrapper[4839]: I0321 04:23:48.642599 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505"} Mar 21 04:23:48 crc kubenswrapper[4839]: I0321 04:23:48.642680 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:48 crc 
kubenswrapper[4839]: I0321 04:23:48.643772 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:48 crc kubenswrapper[4839]: I0321 04:23:48.643806 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:48 crc kubenswrapper[4839]: I0321 04:23:48.643816 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:49 crc kubenswrapper[4839]: I0321 04:23:49.397606 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:49Z is after 2026-02-23T05:33:13Z Mar 21 04:23:49 crc kubenswrapper[4839]: I0321 04:23:49.647891 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:49 crc kubenswrapper[4839]: I0321 04:23:49.648946 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:49 crc kubenswrapper[4839]: I0321 04:23:49.649000 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:49 crc kubenswrapper[4839]: I0321 04:23:49.649016 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:50 crc kubenswrapper[4839]: I0321 04:23:50.397271 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:50Z is after 2026-02-23T05:33:13Z Mar 21 04:23:50 crc 
kubenswrapper[4839]: I0321 04:23:50.452126 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:50 crc kubenswrapper[4839]: I0321 04:23:50.454293 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:50 crc kubenswrapper[4839]: I0321 04:23:50.454331 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:50 crc kubenswrapper[4839]: I0321 04:23:50.454343 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:50 crc kubenswrapper[4839]: I0321 04:23:50.454983 4839 scope.go:117] "RemoveContainer" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843" Mar 21 04:23:50 crc kubenswrapper[4839]: I0321 04:23:50.660671 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 04:23:50 crc kubenswrapper[4839]: E0321 04:23:50.899415 4839 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:50Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ec088a75d2ad4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,LastTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.301720 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.302804 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.302851 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.302869 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.302902 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:23:51 crc kubenswrapper[4839]: E0321 04:23:51.304398 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:51Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 21 04:23:51 crc kubenswrapper[4839]: E0321 04:23:51.305779 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:51Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.396586 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-03-21T04:23:51Z is after 2026-02-23T05:33:13Z Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.666513 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.666935 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.668480 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2edb841c97e50e44d1c47b38425979e96688c3d536fe263be8db34fe4f7ec6ce" exitCode=255 Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.668531 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2edb841c97e50e44d1c47b38425979e96688c3d536fe263be8db34fe4f7ec6ce"} Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.668602 4839 scope.go:117] "RemoveContainer" containerID="7476f5ffcd4543713ca21f82ab34782837d059df6a823fa21380af4d37875843" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.668724 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.669609 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.669669 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.669690 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
21 04:23:51 crc kubenswrapper[4839]: I0321 04:23:51.670524 4839 scope.go:117] "RemoveContainer" containerID="2edb841c97e50e44d1c47b38425979e96688c3d536fe263be8db34fe4f7ec6ce" Mar 21 04:23:51 crc kubenswrapper[4839]: E0321 04:23:51.671090 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:23:52 crc kubenswrapper[4839]: I0321 04:23:52.397422 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:52Z is after 2026-02-23T05:33:13Z Mar 21 04:23:52 crc kubenswrapper[4839]: I0321 04:23:52.672858 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.398666 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:53Z is after 2026-02-23T05:33:13Z Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.679470 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.679759 4839 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.680934 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.680963 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.680973 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.681439 4839 scope.go:117] "RemoveContainer" containerID="2edb841c97e50e44d1c47b38425979e96688c3d536fe263be8db34fe4f7ec6ce" Mar 21 04:23:53 crc kubenswrapper[4839]: E0321 04:23:53.681636 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.923270 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.923458 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.924718 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.924765 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:53 crc kubenswrapper[4839]: I0321 04:23:53.924781 4839 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:54 crc kubenswrapper[4839]: I0321 04:23:54.250401 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:23:54 crc kubenswrapper[4839]: I0321 04:23:54.398054 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:54Z is after 2026-02-23T05:33:13Z Mar 21 04:23:54 crc kubenswrapper[4839]: I0321 04:23:54.679462 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:54 crc kubenswrapper[4839]: I0321 04:23:54.680295 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:54 crc kubenswrapper[4839]: I0321 04:23:54.680336 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:54 crc kubenswrapper[4839]: I0321 04:23:54.680354 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:55 crc kubenswrapper[4839]: I0321 04:23:55.178547 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 04:23:55 crc kubenswrapper[4839]: E0321 04:23:55.182192 4839 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-21T04:23:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:55 crc kubenswrapper[4839]: E0321 04:23:55.183417 4839 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 21 04:23:55 crc kubenswrapper[4839]: I0321 04:23:55.396966 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:55Z is after 2026-02-23T05:33:13Z Mar 21 04:23:56 crc kubenswrapper[4839]: I0321 04:23:56.397033 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:56Z is after 2026-02-23T05:33:13Z Mar 21 04:23:56 crc kubenswrapper[4839]: E0321 04:23:56.519964 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.251434 4839 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.251557 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.397615 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:57Z is after 2026-02-23T05:33:13Z Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.770648 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.770976 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.772470 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.772524 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.772547 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:57 crc kubenswrapper[4839]: I0321 04:23:57.773177 4839 scope.go:117] "RemoveContainer" containerID="2edb841c97e50e44d1c47b38425979e96688c3d536fe263be8db34fe4f7ec6ce" Mar 21 04:23:57 crc kubenswrapper[4839]: E0321 04:23:57.773357 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:23:58 crc kubenswrapper[4839]: I0321 04:23:58.306374 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:23:58 crc kubenswrapper[4839]: I0321 04:23:58.307715 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:23:58 crc kubenswrapper[4839]: I0321 04:23:58.307759 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:23:58 crc kubenswrapper[4839]: I0321 04:23:58.307770 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:23:58 crc kubenswrapper[4839]: I0321 04:23:58.307795 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:23:58 crc kubenswrapper[4839]: E0321 04:23:58.308044 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:58Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 21 04:23:58 crc kubenswrapper[4839]: E0321 04:23:58.310492 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:58Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:23:58 crc kubenswrapper[4839]: I0321 04:23:58.400252 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:58Z is after 2026-02-23T05:33:13Z Mar 21 04:23:58 crc kubenswrapper[4839]: W0321 04:23:58.917117 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:58Z is after 2026-02-23T05:33:13Z Mar 21 04:23:58 crc kubenswrapper[4839]: E0321 04:23:58.917224 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:23:59 crc kubenswrapper[4839]: I0321 04:23:59.397425 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:23:59Z is after 2026-02-23T05:33:13Z Mar 21 04:24:00 crc kubenswrapper[4839]: I0321 04:24:00.397844 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:00Z is after 2026-02-23T05:33:13Z Mar 21 04:24:00 crc kubenswrapper[4839]: E0321 
04:24:00.902729 4839 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:00Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ec088a75d2ad4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,LastTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:01 crc kubenswrapper[4839]: W0321 04:24:01.139787 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:01Z is after 2026-02-23T05:33:13Z Mar 21 04:24:01 crc kubenswrapper[4839]: E0321 04:24:01.139868 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:24:01 crc kubenswrapper[4839]: I0321 04:24:01.398219 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:01Z is after 2026-02-23T05:33:13Z Mar 21 04:24:02 crc kubenswrapper[4839]: I0321 04:24:02.397400 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:02Z is after 2026-02-23T05:33:13Z Mar 21 04:24:03 crc kubenswrapper[4839]: I0321 04:24:03.398560 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:03Z is after 2026-02-23T05:33:13Z Mar 21 04:24:04 crc kubenswrapper[4839]: I0321 04:24:04.398711 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:04Z is after 2026-02-23T05:33:13Z Mar 21 04:24:04 crc kubenswrapper[4839]: W0321 04:24:04.736322 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:04Z is after 2026-02-23T05:33:13Z Mar 21 04:24:04 crc kubenswrapper[4839]: E0321 04:24:04.736406 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 21 04:24:05 crc kubenswrapper[4839]: I0321 04:24:05.311720 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:05 crc kubenswrapper[4839]: E0321 04:24:05.311873 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:05Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 21 04:24:05 crc kubenswrapper[4839]: I0321 04:24:05.313089 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:05 crc kubenswrapper[4839]: I0321 04:24:05.313139 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:05 crc kubenswrapper[4839]: I0321 04:24:05.313152 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:05 crc kubenswrapper[4839]: I0321 04:24:05.313180 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:24:05 crc kubenswrapper[4839]: E0321 04:24:05.315708 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:05Z is after 2026-02-23T05:33:13Z" node="crc" Mar 21 04:24:05 crc kubenswrapper[4839]: I0321 
04:24:05.400080 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:05Z is after 2026-02-23T05:33:13Z Mar 21 04:24:06 crc kubenswrapper[4839]: I0321 04:24:06.399011 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:06Z is after 2026-02-23T05:33:13Z Mar 21 04:24:06 crc kubenswrapper[4839]: E0321 04:24:06.520114 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:24:07 crc kubenswrapper[4839]: I0321 04:24:07.250927 4839 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:24:07 crc kubenswrapper[4839]: I0321 04:24:07.251095 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:24:07 crc kubenswrapper[4839]: I0321 04:24:07.398987 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:24:07Z is after 2026-02-23T05:33:13Z Mar 21 04:24:08 crc kubenswrapper[4839]: I0321 04:24:08.401832 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:09 crc kubenswrapper[4839]: W0321 04:24:09.339294 4839 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 21 04:24:09 crc kubenswrapper[4839]: E0321 04:24:09.339365 4839 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 21 04:24:09 crc kubenswrapper[4839]: I0321 04:24:09.398305 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:09 crc kubenswrapper[4839]: I0321 04:24:09.398364 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 21 04:24:09 crc kubenswrapper[4839]: I0321 04:24:09.398872 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:09 crc kubenswrapper[4839]: I0321 04:24:09.400715 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 21 04:24:09 crc kubenswrapper[4839]: I0321 04:24:09.400764 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:09 crc kubenswrapper[4839]: I0321 04:24:09.400777 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:10 crc kubenswrapper[4839]: I0321 04:24:10.401674 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.908178 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088a75d2ad4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,LastTimestamp:2026-03-21 04:23:16.386540244 +0000 UTC m=+0.714326930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.914858 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.919927 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.925923 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7cbf3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: 
NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,LastTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.930305 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088af12e94c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.515891532 +0000 UTC m=+0.843678208,LastTimestamp:2026-03-21 04:23:16.515891532 +0000 UTC m=+0.843678208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.935276 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c603a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.553508373 +0000 
UTC m=+0.881295049,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.940732 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c9f1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.553535404 +0000 UTC m=+0.881322080,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.944993 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7cbf3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7cbf3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,LastTimestamp:2026-03-21 04:23:16.553545814 +0000 UTC m=+0.881332490,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 
04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.949391 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c603a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.554641629 +0000 UTC m=+0.882428305,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.953477 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c9f1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.55465985 +0000 UTC m=+0.882446526,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.957939 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7cbf3b\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7cbf3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,LastTimestamp:2026-03-21 04:23:16.55466963 +0000 UTC m=+0.882456306,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.961452 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c603a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.554725991 +0000 UTC m=+0.882512667,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.964790 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c9f1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.554736102 +0000 UTC m=+0.882522778,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.968769 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7cbf3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7cbf3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,LastTimestamp:2026-03-21 04:23:16.554744022 +0000 UTC m=+0.882530698,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.972210 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c603a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.555774686 +0000 UTC m=+0.883561382,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.979734 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c9f1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.555793796 +0000 UTC m=+0.883580492,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.985185 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7cbf3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7cbf3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,LastTimestamp:2026-03-21 04:23:16.555810127 +0000 UTC m=+0.883596823,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.989407 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c603a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.556993344 +0000 UTC m=+0.884780020,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.993316 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c9f1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC 
m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.557019665 +0000 UTC m=+0.884806341,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:10 crc kubenswrapper[4839]: E0321 04:24:10.997005 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7cbf3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7cbf3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,LastTimestamp:2026-03-21 04:23:16.557028815 +0000 UTC m=+0.884815491,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.000858 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c603a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.55725637 +0000 UTC m=+0.885043046,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.005429 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c9f1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.557281751 +0000 UTC m=+0.885068427,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.009348 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7cbf3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7cbf3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438941499 +0000 UTC m=+0.766728175,LastTimestamp:2026-03-21 04:23:16.557292121 +0000 UTC m=+0.885078797,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.016353 4839 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c603a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c603a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438917178 +0000 UTC m=+0.766703854,LastTimestamp:2026-03-21 04:23:16.557578047 +0000 UTC m=+0.885364723,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.019923 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ec088aa7c9f1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ec088aa7c9f1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.438933278 +0000 UTC m=+0.766719954,LastTimestamp:2026-03-21 04:23:16.557608958 +0000 UTC m=+0.885395634,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.024624 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec088c92a6187 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.953637255 +0000 UTC m=+1.281423961,LastTimestamp:2026-03-21 04:23:16.953637255 +0000 UTC m=+1.281423961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.028689 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec088ca45edee openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.972219886 +0000 UTC m=+1.300006562,LastTimestamp:2026-03-21 04:23:16.972219886 +0000 UTC 
m=+1.300006562,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.033107 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec088ca490f8d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.972425101 +0000 UTC m=+1.300211807,LastTimestamp:2026-03-21 04:23:16.972425101 +0000 UTC m=+1.300211807,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.038256 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec088cb0920ee openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.985012462 +0000 UTC m=+1.312799138,LastTimestamp:2026-03-21 04:23:16.985012462 +0000 UTC m=+1.312799138,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.040326 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec088cb831332 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:16.993004338 +0000 UTC m=+1.320791014,LastTimestamp:2026-03-21 04:23:16.993004338 +0000 UTC m=+1.320791014,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.041841 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec088fbfb6018 openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.806194712 +0000 UTC m=+2.133981388,LastTimestamp:2026-03-21 04:23:17.806194712 +0000 UTC m=+2.133981388,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.045962 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec088fbfb5fdc openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.806194652 +0000 UTC m=+2.133981348,LastTimestamp:2026-03-21 04:23:17.806194652 +0000 UTC m=+2.133981348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.049501 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec088fbfcea52 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.806295634 +0000 UTC m=+2.134082310,LastTimestamp:2026-03-21 04:23:17.806295634 +0000 UTC m=+2.134082310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.053134 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec088fbfd015e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.806301534 +0000 UTC m=+2.134088250,LastTimestamp:2026-03-21 04:23:17.806301534 +0000 UTC m=+2.134088250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.056563 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec088fc04146d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.806765165 +0000 UTC m=+2.134551851,LastTimestamp:2026-03-21 04:23:17.806765165 +0000 UTC m=+2.134551851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.059746 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec088fcbaae6e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.818732142 +0000 UTC m=+2.146518818,LastTimestamp:2026-03-21 04:23:17.818732142 +0000 UTC m=+2.146518818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.063481 4839 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec088fcd112c0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.820199616 +0000 UTC m=+2.147986292,LastTimestamp:2026-03-21 04:23:17.820199616 +0000 UTC m=+2.147986292,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.066656 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec088fce323d2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.821383634 +0000 UTC m=+2.149170310,LastTimestamp:2026-03-21 04:23:17.821383634 +0000 UTC m=+2.149170310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.070149 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec088fceaa789 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.821876105 +0000 UTC m=+2.149662781,LastTimestamp:2026-03-21 04:23:17.821876105 +0000 UTC m=+2.149662781,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.074037 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec088fced4d8f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.822049679 +0000 UTC m=+2.149836365,LastTimestamp:2026-03-21 04:23:17.822049679 +0000 UTC 
m=+2.149836365,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.077344 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec088fcf0454e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.822244174 +0000 UTC m=+2.150030850,LastTimestamp:2026-03-21 04:23:17.822244174 +0000 UTC m=+2.150030850,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.081687 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0890e4a5fa6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container 
cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.11336183 +0000 UTC m=+2.441148506,LastTimestamp:2026-03-21 04:23:18.11336183 +0000 UTC m=+2.441148506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.085312 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0890f18c62d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.126888493 +0000 UTC m=+2.454675189,LastTimestamp:2026-03-21 04:23:18.126888493 +0000 UTC m=+2.454675189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.088813 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0890f2fb464 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.128391268 +0000 UTC m=+2.456177964,LastTimestamp:2026-03-21 04:23:18.128391268 +0000 UTC m=+2.456177964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.092029 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0891a6a9381 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.316798849 +0000 UTC m=+2.644585525,LastTimestamp:2026-03-21 04:23:18.316798849 +0000 UTC m=+2.644585525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.095281 4839 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0891b1325b4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.327846324 +0000 UTC m=+2.655633000,LastTimestamp:2026-03-21 04:23:18.327846324 +0000 UTC m=+2.655633000,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.098545 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0891b243a70 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 
04:23:18.328965744 +0000 UTC m=+2.656752410,LastTimestamp:2026-03-21 04:23:18.328965744 +0000 UTC m=+2.656752410,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.102112 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08923fcecf5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.477384949 +0000 UTC m=+2.805171635,LastTimestamp:2026-03-21 04:23:18.477384949 +0000 UTC m=+2.805171635,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.105468 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089241ff99f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.479681951 +0000 UTC m=+2.807468647,LastTimestamp:2026-03-21 04:23:18.479681951 +0000 UTC m=+2.807468647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.108958 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec089244654e8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.482195688 +0000 UTC m=+2.809982374,LastTimestamp:2026-03-21 04:23:18.482195688 +0000 UTC m=+2.809982374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.112252 4839 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec08924997f3e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.487646014 +0000 UTC m=+2.815432700,LastTimestamp:2026-03-21 04:23:18.487646014 +0000 UTC m=+2.815432700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.115598 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0892865f19e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.551376286 +0000 UTC 
m=+2.879162962,LastTimestamp:2026-03-21 04:23:18.551376286 +0000 UTC m=+2.879162962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.118738 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0892a1a61ea openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.57997873 +0000 UTC m=+2.907765426,LastTimestamp:2026-03-21 04:23:18.57997873 +0000 UTC m=+2.907765426,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.122033 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08930386c23 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.682610723 +0000 UTC m=+3.010397409,LastTimestamp:2026-03-21 04:23:18.682610723 +0000 UTC m=+3.010397409,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.125237 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec089304673d5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.683530197 +0000 UTC m=+3.011316873,LastTimestamp:2026-03-21 04:23:18.683530197 +0000 UTC m=+3.011316873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.129284 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec0893049ffc9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.683762633 +0000 UTC m=+3.011549309,LastTimestamp:2026-03-21 04:23:18.683762633 +0000 UTC m=+3.011549309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.133067 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec089304beb0d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.683888397 +0000 UTC m=+3.011675073,LastTimestamp:2026-03-21 04:23:18.683888397 +0000 UTC m=+3.011675073,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.136824 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ec0893185ac4c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.704450636 +0000 UTC m=+3.032237332,LastTimestamp:2026-03-21 04:23:18.704450636 +0000 UTC m=+3.032237332,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.140382 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08931d8d544 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.709900612 +0000 UTC m=+3.037687288,LastTimestamp:2026-03-21 04:23:18.709900612 +0000 UTC m=+3.037687288,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.142098 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec08931d91766 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.709917542 +0000 UTC m=+3.037704238,LastTimestamp:2026-03-21 04:23:18.709917542 +0000 UTC m=+3.037704238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.144983 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec08931d8d490 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.709900432 +0000 UTC m=+3.037687128,LastTimestamp:2026-03-21 04:23:18.709900432 +0000 UTC m=+3.037687128,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.148164 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec08931e84387 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.710911879 +0000 UTC m=+3.038698565,LastTimestamp:2026-03-21 04:23:18.710911879 +0000 UTC m=+3.038698565,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.151369 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08931f4b4b1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.711727281 +0000 UTC m=+3.039513957,LastTimestamp:2026-03-21 04:23:18.711727281 
+0000 UTC m=+3.039513957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.154452 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec0893f1b027e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.932341374 +0000 UTC m=+3.260128050,LastTimestamp:2026-03-21 04:23:18.932341374 +0000 UTC m=+3.260128050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.158036 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec0893f1cb199 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.932451737 +0000 UTC m=+3.260238413,LastTimestamp:2026-03-21 04:23:18.932451737 +0000 UTC m=+3.260238413,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.161097 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec0894037fb89 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.951017353 +0000 UTC m=+3.278804029,LastTimestamp:2026-03-21 04:23:18.951017353 +0000 UTC m=+3.278804029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.164472 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec089404d8bf5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.952430581 +0000 UTC m=+3.280217257,LastTimestamp:2026-03-21 04:23:18.952430581 +0000 UTC m=+3.280217257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.167908 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec0894083a707 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.955976455 +0000 UTC m=+3.283763131,LastTimestamp:2026-03-21 04:23:18.955976455 +0000 UTC m=+3.283763131,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.171203 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec08940a38979 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.958066041 +0000 UTC m=+3.285852707,LastTimestamp:2026-03-21 04:23:18.958066041 +0000 UTC m=+3.285852707,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.174408 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec0894a3bec8e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.119047822 +0000 UTC m=+3.446834528,LastTimestamp:2026-03-21 04:23:19.119047822 +0000 UTC m=+3.446834528,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.178181 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec0894a4e2004 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.120240644 +0000 UTC m=+3.448027310,LastTimestamp:2026-03-21 04:23:19.120240644 +0000 UTC m=+3.448027310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.181806 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec0894afca1c0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 
04:23:19.13167712 +0000 UTC m=+3.459463796,LastTimestamp:2026-03-21 04:23:19.13167712 +0000 UTC m=+3.459463796,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.185333 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec0894b1590a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.133311143 +0000 UTC m=+3.461097819,LastTimestamp:2026-03-21 04:23:19.133311143 +0000 UTC m=+3.461097819,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.188768 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ec0894b2fef9d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.135039389 +0000 UTC m=+3.462826065,LastTimestamp:2026-03-21 04:23:19.135039389 +0000 UTC m=+3.462826065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.191923 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08955d80194 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.313826196 +0000 UTC m=+3.641612872,LastTimestamp:2026-03-21 04:23:19.313826196 +0000 UTC m=+3.641612872,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.195489 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08956c6c930 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.329474864 +0000 UTC m=+3.657261540,LastTimestamp:2026-03-21 04:23:19.329474864 +0000 UTC m=+3.657261540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.199490 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08956da05e4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.330735588 +0000 UTC m=+3.658522264,LastTimestamp:2026-03-21 04:23:19.330735588 +0000 UTC m=+3.658522264,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 
04:24:11.204267 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089615dafaf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.507136431 +0000 UTC m=+3.834923107,LastTimestamp:2026-03-21 04:23:19.507136431 +0000 UTC m=+3.834923107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.208107 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec089642866eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.553976043 +0000 UTC m=+3.881762719,LastTimestamp:2026-03-21 04:23:19.553976043 +0000 UTC 
m=+3.881762719,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.211947 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec089660847fb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.585425403 +0000 UTC m=+3.913212079,LastTimestamp:2026-03-21 04:23:19.585425403 +0000 UTC m=+3.913212079,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.216074 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec0896f33bf8b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.739269003 +0000 UTC m=+4.067055679,LastTimestamp:2026-03-21 
04:23:19.739269003 +0000 UTC m=+4.067055679,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.219693 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec08970343400 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.756076032 +0000 UTC m=+4.083862708,LastTimestamp:2026-03-21 04:23:19.756076032 +0000 UTC m=+4.083862708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.223355 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec0899d7a9cab openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 
04:23:20.515665067 +0000 UTC m=+4.843451733,LastTimestamp:2026-03-21 04:23:20.515665067 +0000 UTC m=+4.843451733,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.227939 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089ac3cfc52 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:20.763284562 +0000 UTC m=+5.091071238,LastTimestamp:2026-03-21 04:23:20.763284562 +0000 UTC m=+5.091071238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.231359 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089ad08a3bd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:20.776631229 +0000 UTC m=+5.104417915,LastTimestamp:2026-03-21 04:23:20.776631229 +0000 UTC 
m=+5.104417915,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.234323 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089ad18a4e1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:20.777680097 +0000 UTC m=+5.105466793,LastTimestamp:2026-03-21 04:23:20.777680097 +0000 UTC m=+5.105466793,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.237936 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089bb509eeb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.016229611 +0000 UTC 
m=+5.344016287,LastTimestamp:2026-03-21 04:23:21.016229611 +0000 UTC m=+5.344016287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.241145 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089bc25ec66 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.030208614 +0000 UTC m=+5.357995290,LastTimestamp:2026-03-21 04:23:21.030208614 +0000 UTC m=+5.357995290,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.244587 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089bc33e5e1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.031124449 +0000 UTC m=+5.358911125,LastTimestamp:2026-03-21 04:23:21.031124449 +0000 UTC m=+5.358911125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.248439 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089c84d9d2b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.234136363 +0000 UTC m=+5.561923059,LastTimestamp:2026-03-21 04:23:21.234136363 +0000 UTC m=+5.561923059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.251487 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089c9293439 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.248527417 +0000 UTC m=+5.576314103,LastTimestamp:2026-03-21 04:23:21.248527417 +0000 UTC m=+5.576314103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.255334 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089c93b8b3b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.249729339 +0000 UTC m=+5.577516025,LastTimestamp:2026-03-21 04:23:21.249729339 +0000 UTC m=+5.577516025,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.259201 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089d6fa7bad openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.480346541 +0000 UTC m=+5.808133227,LastTimestamp:2026-03-21 04:23:21.480346541 +0000 UTC m=+5.808133227,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.262694 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089d79c7f00 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.490964224 +0000 UTC m=+5.818750910,LastTimestamp:2026-03-21 04:23:21.490964224 +0000 UTC m=+5.818750910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.265821 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089d7ace029 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.492037673 +0000 UTC m=+5.819824359,LastTimestamp:2026-03-21 04:23:21.492037673 +0000 UTC m=+5.819824359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.269879 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089e261670a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.671640842 +0000 UTC m=+5.999427508,LastTimestamp:2026-03-21 04:23:21.671640842 +0000 UTC m=+5.999427508,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.272856 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ec089e303b6d0 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:21.682278096 +0000 UTC m=+6.010064762,LastTimestamp:2026-03-21 04:23:21.682278096 +0000 UTC m=+6.010064762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.277728 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec08b2ef4d318 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 21 04:24:11 crc kubenswrapper[4839]: body: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:27.251338008 +0000 UTC m=+11.579124694,LastTimestamp:2026-03-21 04:23:27.251338008 +0000 UTC m=+11.579124694,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc 
kubenswrapper[4839]: E0321 04:24:11.281092 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec08b2ef5d238 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:27.25140332 +0000 UTC m=+11.579190006,LastTimestamp:2026-03-21 04:23:27.25140332 +0000 UTC m=+11.579190006,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.284543 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-apiserver-crc.189ec08c086a7edc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 21 04:24:11 crc kubenswrapper[4839]: body: 
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 21 04:24:11 crc kubenswrapper[4839]: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:30.899705564 +0000 UTC m=+15.227492260,LastTimestamp:2026-03-21 04:23:30.899705564 +0000 UTC m=+15.227492260,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.287943 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08c086c41e3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:30.899821027 +0000 UTC m=+15.227607723,LastTimestamp:2026-03-21 04:23:30.899821027 +0000 UTC m=+15.227607723,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.291256 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ec08c086a7edc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-apiserver-crc.189ec08c086a7edc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 21 04:24:11 crc kubenswrapper[4839]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 21 04:24:11 crc kubenswrapper[4839]: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:30.899705564 +0000 UTC m=+15.227492260,LastTimestamp:2026-03-21 04:23:30.904550433 +0000 UTC m=+15.232337119,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.294986 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ec08c086c41e3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08c086c41e3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:30.899821027 +0000 UTC m=+15.227607723,LastTimestamp:2026-03-21 04:23:30.904616535 +0000 UTC m=+15.232403221,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.298038 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-apiserver-crc.189ec08c245b404e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Mar 21 04:24:11 crc kubenswrapper[4839]: body: [+]ping ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]log ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]etcd ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/generic-apiserver-start-informers ok Mar 21 04:24:11 crc kubenswrapper[4839]: 
[+]poststarthook/priority-and-fairness-config-consumer ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/priority-and-fairness-filter ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-apiextensions-informers ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-apiextensions-controllers ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/crd-informer-synced ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-system-namespaces-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 21 04:24:11 crc kubenswrapper[4839]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 21 04:24:11 crc kubenswrapper[4839]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/bootstrap-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/start-kube-aggregator-informers ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: 
[+]poststarthook/apiservice-registration-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/apiservice-discovery-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]autoregister-completion ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/apiservice-openapi-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 21 04:24:11 crc kubenswrapper[4839]: livez check failed Mar 21 04:24:11 crc kubenswrapper[4839]: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:31.368468558 +0000 UTC m=+15.696255264,LastTimestamp:2026-03-21 04:23:31.368468558 +0000 UTC m=+15.696255264,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.300887 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08c245c0986 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:31.36852007 +0000 UTC m=+15.696306786,LastTimestamp:2026-03-21 04:23:31.36852007 +0000 UTC 
m=+15.696306786,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.304765 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ec08956da05e4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ec08956da05e4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:19.330735588 +0000 UTC m=+3.658522264,LastTimestamp:2026-03-21 04:23:31.572396107 +0000 UTC m=+15.900182783,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.309178 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec08d8308d163 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 21 04:24:11 crc kubenswrapper[4839]: body: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:37.251869027 +0000 UTC m=+21.579655713,LastTimestamp:2026-03-21 04:23:37.251869027 +0000 UTC m=+21.579655713,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.312621 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec08d8309b48a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:37.251927178 +0000 UTC m=+21.579713874,LastTimestamp:2026-03-21 04:23:37.251927178 +0000 UTC 
m=+21.579713874,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.316780 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec08d8308d163\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec08d8308d163 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 21 04:24:11 crc kubenswrapper[4839]: body: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:37.251869027 +0000 UTC m=+21.579655713,LastTimestamp:2026-03-21 04:23:47.251820604 +0000 UTC m=+31.579607300,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.319994 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec08d8309b48a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec08d8309b48a 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:37.251927178 +0000 UTC m=+21.579713874,LastTimestamp:2026-03-21 04:23:47.251953719 +0000 UTC m=+31.579740415,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.323229 4839 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec08fd740778d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:47.254736781 +0000 UTC m=+31.582523487,LastTimestamp:2026-03-21 04:23:47.254736781 +0000 UTC m=+31.582523487,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc 
kubenswrapper[4839]: E0321 04:24:11.327050 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec088fcd112c0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec088fcd112c0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:17.820199616 +0000 UTC m=+2.147986292,LastTimestamp:2026-03-21 04:23:47.944168573 +0000 UTC m=+32.271955249,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.333657 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec0890e4a5fa6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0890e4a5fa6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created 
container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.11336183 +0000 UTC m=+2.441148506,LastTimestamp:2026-03-21 04:23:48.272318002 +0000 UTC m=+32.600104678,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.336919 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec0890f18c62d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec0890f18c62d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:18.126888493 +0000 UTC m=+2.454675189,LastTimestamp:2026-03-21 04:23:48.363119666 +0000 UTC m=+32.690906342,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.344702 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec08b2ef4d318\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec08b2ef4d318 openshift-kube-controller-manager 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 21 04:24:11 crc kubenswrapper[4839]: body: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:27.251338008 +0000 UTC m=+11.579124694,LastTimestamp:2026-03-21 04:23:57.251519488 +0000 UTC m=+41.579306204,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.348372 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec08b2ef5d238\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ec08b2ef5d238 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:27.25140332 +0000 UTC m=+11.579190006,LastTimestamp:2026-03-21 04:23:57.251648062 +0000 UTC 
m=+41.579434768,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:24:11 crc kubenswrapper[4839]: E0321 04:24:11.352166 4839 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ec08d8308d163\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 21 04:24:11 crc kubenswrapper[4839]: &Event{ObjectMeta:{kube-controller-manager-crc.189ec08d8308d163 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 21 04:24:11 crc kubenswrapper[4839]: body: Mar 21 04:24:11 crc kubenswrapper[4839]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:23:37.251869027 +0000 UTC m=+21.579655713,LastTimestamp:2026-03-21 04:24:07.251052796 +0000 UTC m=+51.578839512,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 21 04:24:11 crc kubenswrapper[4839]: > Mar 21 04:24:11 crc kubenswrapper[4839]: I0321 04:24:11.398292 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:12 crc kubenswrapper[4839]: I0321 04:24:12.316630 4839 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:12 crc kubenswrapper[4839]: I0321 04:24:12.318850 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:12 crc kubenswrapper[4839]: I0321 04:24:12.318878 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:12 crc kubenswrapper[4839]: I0321 04:24:12.318887 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:12 crc kubenswrapper[4839]: I0321 04:24:12.318933 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:24:12 crc kubenswrapper[4839]: E0321 04:24:12.320505 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 21 04:24:12 crc kubenswrapper[4839]: E0321 04:24:12.320989 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 04:24:12 crc kubenswrapper[4839]: I0321 04:24:12.400020 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.401900 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.452802 4839 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.454339 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.454422 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.454443 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.455488 4839 scope.go:117] "RemoveContainer" containerID="2edb841c97e50e44d1c47b38425979e96688c3d536fe263be8db34fe4f7ec6ce" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.728941 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.731897 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66"} Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.732271 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.733870 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.733907 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:13 crc kubenswrapper[4839]: I0321 04:24:13.733919 4839 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 21 04:24:14 crc kubenswrapper[4839]: I0321 04:24:14.399835 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.399341 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.739012 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.739766 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.741425 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" exitCode=255 Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.741460 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66"} Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.741497 4839 scope.go:117] "RemoveContainer" containerID="2edb841c97e50e44d1c47b38425979e96688c3d536fe263be8db34fe4f7ec6ce" Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.741679 4839 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.742488 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.742516 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.742524 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:15 crc kubenswrapper[4839]: I0321 04:24:15.749188 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" Mar 21 04:24:15 crc kubenswrapper[4839]: E0321 04:24:15.750465 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:24:16 crc kubenswrapper[4839]: I0321 04:24:16.398749 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:16 crc kubenswrapper[4839]: E0321 04:24:16.520472 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:24:16 crc kubenswrapper[4839]: I0321 04:24:16.746384 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 
04:24:17.251290 4839 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.251351 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.251401 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.251526 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.253434 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.253461 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.253475 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.254054 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505"} 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.254139 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505" gracePeriod=30 Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.399904 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.753444 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.754545 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.754923 4839 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505" exitCode=255 Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.754967 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505"} Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 
04:24:17.755000 4839 scope.go:117] "RemoveContainer" containerID="1e036a62b31123ade40717ddff4b8b13971bdcda78062ebf348d49978c3c7a58" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.770406 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.770580 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.771525 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.771602 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.771619 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:17 crc kubenswrapper[4839]: I0321 04:24:17.772226 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" Mar 21 04:24:17 crc kubenswrapper[4839]: E0321 04:24:17.772425 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:24:18 crc kubenswrapper[4839]: I0321 04:24:18.400304 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:18 crc kubenswrapper[4839]: I0321 
04:24:18.759626 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 21 04:24:18 crc kubenswrapper[4839]: I0321 04:24:18.761086 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065"} Mar 21 04:24:18 crc kubenswrapper[4839]: I0321 04:24:18.761269 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:18 crc kubenswrapper[4839]: I0321 04:24:18.762405 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:18 crc kubenswrapper[4839]: I0321 04:24:18.762439 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:18 crc kubenswrapper[4839]: I0321 04:24:18.762454 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.320650 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.321756 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.321786 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.321797 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.321818 4839 kubelet_node_status.go:76] 
"Attempting to register node" node="crc" Mar 21 04:24:19 crc kubenswrapper[4839]: E0321 04:24:19.324560 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 21 04:24:19 crc kubenswrapper[4839]: E0321 04:24:19.325424 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.395633 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.765286 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.766188 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.766270 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:19 crc kubenswrapper[4839]: I0321 04:24:19.766296 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:20 crc kubenswrapper[4839]: I0321 04:24:20.411590 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:21 crc kubenswrapper[4839]: I0321 04:24:21.402037 4839 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:22 crc kubenswrapper[4839]: I0321 04:24:22.397910 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.398651 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.678703 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.678985 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.680734 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.680773 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.680782 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.681308 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" Mar 21 04:24:23 crc kubenswrapper[4839]: E0321 04:24:23.681455 4839 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.923908 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.924071 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.925184 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.925228 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:23 crc kubenswrapper[4839]: I0321 04:24:23.925238 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:24 crc kubenswrapper[4839]: I0321 04:24:24.250482 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:24:24 crc kubenswrapper[4839]: I0321 04:24:24.254131 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:24:24 crc kubenswrapper[4839]: I0321 04:24:24.397987 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:24 crc kubenswrapper[4839]: I0321 
04:24:24.774620 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:24 crc kubenswrapper[4839]: I0321 04:24:24.775440 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:24 crc kubenswrapper[4839]: I0321 04:24:24.775500 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:24 crc kubenswrapper[4839]: I0321 04:24:24.775511 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:25 crc kubenswrapper[4839]: I0321 04:24:25.398864 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:25 crc kubenswrapper[4839]: I0321 04:24:25.776537 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:25 crc kubenswrapper[4839]: I0321 04:24:25.777383 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:25 crc kubenswrapper[4839]: I0321 04:24:25.777421 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:25 crc kubenswrapper[4839]: I0321 04:24:25.777434 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:26 crc kubenswrapper[4839]: I0321 04:24:26.325091 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:26 crc kubenswrapper[4839]: I0321 04:24:26.326222 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:26 crc kubenswrapper[4839]: 
I0321 04:24:26.326254 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:26 crc kubenswrapper[4839]: I0321 04:24:26.326266 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:26 crc kubenswrapper[4839]: I0321 04:24:26.326291 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:24:26 crc kubenswrapper[4839]: E0321 04:24:26.329907 4839 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 21 04:24:26 crc kubenswrapper[4839]: E0321 04:24:26.330094 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 21 04:24:26 crc kubenswrapper[4839]: I0321 04:24:26.397930 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:26 crc kubenswrapper[4839]: E0321 04:24:26.520953 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:24:27 crc kubenswrapper[4839]: I0321 04:24:27.185120 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 21 04:24:27 crc kubenswrapper[4839]: I0321 04:24:27.198044 4839 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 21 04:24:27 crc kubenswrapper[4839]: I0321 04:24:27.399814 4839 csi_plugin.go:884] Failed to 
contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:28 crc kubenswrapper[4839]: I0321 04:24:28.399810 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:29 crc kubenswrapper[4839]: I0321 04:24:29.398854 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:30 crc kubenswrapper[4839]: I0321 04:24:30.398312 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:31 crc kubenswrapper[4839]: I0321 04:24:31.399493 4839 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 21 04:24:31 crc kubenswrapper[4839]: I0321 04:24:31.761548 4839 csr.go:261] certificate signing request csr-t6bsz is approved, waiting to be issued Mar 21 04:24:31 crc kubenswrapper[4839]: I0321 04:24:31.780867 4839 csr.go:257] certificate signing request csr-t6bsz is issued Mar 21 04:24:31 crc kubenswrapper[4839]: I0321 04:24:31.836816 4839 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 21 04:24:32 crc kubenswrapper[4839]: I0321 04:24:32.187011 4839 transport.go:147] "Certificate rotation detected, shutting down client connections to start using 
new credentials" Mar 21 04:24:32 crc kubenswrapper[4839]: I0321 04:24:32.782104 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-16 20:32:32.914753866 +0000 UTC Mar 21 04:24:32 crc kubenswrapper[4839]: I0321 04:24:32.782153 4839 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6496h8m0.132603148s for next certificate rotation Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.330440 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.331492 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.331528 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.331537 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.331647 4839 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.340028 4839 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.340109 4839 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.340127 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.342841 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.342963 
4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.342989 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.343006 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.343018 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:33Z","lastTransitionTime":"2026-03-21T04:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.357180 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.363555 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.363612 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.363623 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.363637 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.363646 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:33Z","lastTransitionTime":"2026-03-21T04:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.373896 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.380056 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.380107 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.380117 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.380131 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.380140 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:33Z","lastTransitionTime":"2026-03-21T04:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.388923 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.397513 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.397782 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.397797 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.397815 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.397832 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:33Z","lastTransitionTime":"2026-03-21T04:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.409392 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.409498 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.409520 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.510677 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.611347 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.712345 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.813162 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:33 crc kubenswrapper[4839]: E0321 04:24:33.913410 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.927492 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.927711 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.929021 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:33 crc 
kubenswrapper[4839]: I0321 04:24:33.929054 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:33 crc kubenswrapper[4839]: I0321 04:24:33.929063 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.013514 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.114030 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.215179 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.315745 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.416914 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: I0321 04:24:34.451931 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:34 crc kubenswrapper[4839]: I0321 04:24:34.453624 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:34 crc kubenswrapper[4839]: I0321 04:24:34.453675 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:34 crc kubenswrapper[4839]: I0321 04:24:34.453687 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.518071 4839 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.618981 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.720011 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.821186 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:34 crc kubenswrapper[4839]: E0321 04:24:34.922615 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.023738 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.124645 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.225767 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.326428 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.426818 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: I0321 04:24:35.452066 4839 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 21 04:24:35 crc kubenswrapper[4839]: I0321 04:24:35.453169 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:35 crc kubenswrapper[4839]: I0321 04:24:35.453195 4839 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:35 crc kubenswrapper[4839]: I0321 04:24:35.453205 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:35 crc kubenswrapper[4839]: I0321 04:24:35.453794 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.453966 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.527356 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: I0321 04:24:35.535276 4839 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.628379 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.729443 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.830392 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:35 crc kubenswrapper[4839]: E0321 04:24:35.931048 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.031332 4839 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.132162 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.232875 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.333467 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.434354 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.522049 4839 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.534442 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.634911 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.735987 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.836551 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:36 crc kubenswrapper[4839]: E0321 04:24:36.937257 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.037399 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.137944 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.238916 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.339794 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.440148 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.541134 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.642214 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.742346 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.843364 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:37 crc kubenswrapper[4839]: E0321 04:24:37.943507 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.043657 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.144805 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.245024 4839 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found"
Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.345484 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.446169 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.546366 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.646558 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.747212 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.847937 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:38 crc kubenswrapper[4839]: E0321 04:24:38.948616 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.049740 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.150652 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.251090 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.351563 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.452628 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.553730 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.654910 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.755263 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.855595 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:39 crc kubenswrapper[4839]: E0321 04:24:39.955961 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:40 crc kubenswrapper[4839]: E0321 04:24:40.056363 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:40 crc kubenswrapper[4839]: E0321 04:24:40.157537 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:40 crc kubenswrapper[4839]: E0321 04:24:40.258144 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:40 crc kubenswrapper[4839]: E0321 04:24:40.358801 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:40 crc kubenswrapper[4839]: E0321 04:24:40.459466 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:40 crc kubenswrapper[4839]: E0321 04:24:40.560189 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:40 crc kubenswrapper[4839]: E0321 04:24:40.661000 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:40 crc kubenswrapper[4839]: E0321 04:24:40.761690 4839 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.800615 4839 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.865124 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.865176 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.865193 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.865217 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.865232 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:40Z","lastTransitionTime":"2026-03-21T04:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.968605 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.968654 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.968665 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.968683 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:40 crc kubenswrapper[4839]: I0321 04:24:40.968695 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:40Z","lastTransitionTime":"2026-03-21T04:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.071070 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.071150 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.071169 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.071198 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.071219 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.174023 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.174103 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.174126 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.174153 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.174171 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.277143 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.277200 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.277214 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.277232 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.277244 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.381140 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.381192 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.381204 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.381223 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.381237 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.420540 4839 apiserver.go:52] "Watching apiserver"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.427407 4839 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.428021 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"]
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.428642 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.428717 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.428784 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.429261 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.429279 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.429647 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.429656 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.429980 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.430030 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.433416 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.433636 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.433443 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.434544 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.434689 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.434697 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.433478 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.435055 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.435263 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.477040 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.484835 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.484891 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.484902 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.484920 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.484930 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.493339 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.495982 4839 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.499851 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500001 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500109 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500202 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500298 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500394 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500486 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500697 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500813 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.500910 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501013 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501109 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501035 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501044 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.501136 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:24:42.001118281 +0000 UTC m=+86.328904957 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.502477 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501366 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501557 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501773 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.501889 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.502132 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.502190 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.502898 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503021 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503134 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503230 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503333 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503432 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503614 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503437 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503706 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503738 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503855 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503901 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503895 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503918 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503941 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.503979 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504019 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504052 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504091 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504096 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504128 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504172 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504208 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504210 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn"
(OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504248 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504289 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504325 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504368 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504407 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504445 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504454 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504481 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504522 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504557 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504627 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504659 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504693 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504726 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504757 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504788 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 21 04:24:41 crc 
kubenswrapper[4839]: I0321 04:24:41.504821 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504852 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504881 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504888 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504953 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504985 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505018 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505048 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505041 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505065 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505181 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505205 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505228 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505245 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505261 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505280 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505298 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505317 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505337 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505351 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505368 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505388 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc 
kubenswrapper[4839]: I0321 04:24:41.505410 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505428 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505444 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505460 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505477 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505496 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505513 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505538 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505555 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505591 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505609 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505627 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505645 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505668 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505685 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505702 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505718 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505736 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505751 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505769 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505789 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505807 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505825 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505844 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505901 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505918 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505934 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505953 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505969 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505986 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506006 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506031 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506054 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506080 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506108 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506133 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506163 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506187 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506212 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506241 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506266 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506292 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.504995 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505032 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505147 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505155 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505302 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506389 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505509 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505536 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505639 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.505760 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506054 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506062 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506084 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506127 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506380 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506681 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506693 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506854 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506877 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506810 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.507178 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.507512 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.507537 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.507716 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.507982 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.507998 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.508035 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.508356 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.508559 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.508836 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.508834 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.508965 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509024 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509289 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.506321 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509345 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509370 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509374 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). 
InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509389 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509493 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509695 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509792 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509873 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509919 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509965 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510007 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510046 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510090 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510131 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510170 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510209 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510243 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510279 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510365 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510407 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511670 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511719 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512242 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512280 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512312 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512343 
4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512371 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512398 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512423 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512458 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512485 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" 
(UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512511 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512539 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512586 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512611 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512642 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512667 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512696 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512721 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512748 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512775 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512810 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 21 04:24:41 
crc kubenswrapper[4839]: I0321 04:24:41.512838 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512869 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512899 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512927 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512954 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512978 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.513003 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.513027 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.513053 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.513078 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.513101 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 
04:24:41.513127 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509420 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509738 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509764 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509789 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). 
InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509890 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509961 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.509980 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510469 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.510810 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511079 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511178 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511173 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511265 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511375 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511598 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511665 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.511832 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512024 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512089 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512108 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512105 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512128 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.512327 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.515778 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.516833 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.516880 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.516975 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.517083 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.517119 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.517356 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.517714 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518067 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518080 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518106 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518167 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518296 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518365 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518462 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518498 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518517 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518131 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518666 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518805 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.518844 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519001 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.513154 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519388 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519505 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519536 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc 
kubenswrapper[4839]: I0321 04:24:41.519592 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519620 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519647 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519676 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519702 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519728 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519747 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519756 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519800 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519825 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519847 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519867 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519887 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519907 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519927 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519945 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519963 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.519982 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520075 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520097 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520149 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520169 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 21 04:24:41 
crc kubenswrapper[4839]: I0321 04:24:41.520189 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520210 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520229 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520249 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520270 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520290 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520315 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520332 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520355 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520353 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520376 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520400 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520460 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520502 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520522 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520548 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520590 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520617 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520653 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520675 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") 
pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520697 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520722 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520743 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520767 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520789 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520815 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520906 4839 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520920 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520932 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520945 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520956 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520967 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520978 4839 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520989 4839 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521000 4839 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521010 4839 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521020 4839 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521031 4839 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 
21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521041 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521051 4839 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521060 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521071 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521080 4839 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521091 4839 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521107 4839 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521117 4839 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521127 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521137 4839 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521147 4839 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521159 4839 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521168 4839 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521177 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521187 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521196 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521205 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521215 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521226 4839 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521236 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521247 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521256 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc 
kubenswrapper[4839]: I0321 04:24:41.521265 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521274 4839 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521284 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521293 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526139 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526169 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526195 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526216 4839 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526239 4839 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526266 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526303 4839 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526329 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526351 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526372 4839 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526394 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526413 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526433 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526453 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526472 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526491 4839 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526510 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526528 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 
04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526555 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526606 4839 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526631 4839 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526651 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526671 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526692 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526712 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: 
I0321 04:24:41.526735 4839 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526757 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526777 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526797 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526815 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526859 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526881 4839 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526900 4839 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526920 4839 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526964 4839 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526982 4839 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527001 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527020 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527042 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527063 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") 
on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527082 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527101 4839 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527120 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527139 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527162 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527180 4839 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527200 4839 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath 
\"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527222 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527240 4839 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527266 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527287 4839 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527306 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527325 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527348 4839 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc 
kubenswrapper[4839]: I0321 04:24:41.527366 4839 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527383 4839 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527401 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527421 4839 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527438 4839 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527458 4839 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520512 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527477 4839 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520934 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.520956 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521114 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521290 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521272 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521433 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521694 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521800 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521848 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527843 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527498 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.521900 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.522064 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.522096 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.522133 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.528046 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.522216 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.522245 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.522519 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.528005 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.528106 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:42.028089872 +0000 UTC m=+86.355876548 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.528417 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.528593 4839 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.528637 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-21 04:24:42.028605018 +0000 UTC m=+86.356391734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.523298 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.523438 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.523492 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.523530 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.523687 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.524339 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.524387 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.525056 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.525133 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.525300 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.525343 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.525384 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.525778 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.525737 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526061 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526246 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526363 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526614 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526644 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526672 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.526704 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527030 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527222 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527265 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.527442 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.522557 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.529134 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.529340 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.529787 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.529853 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.530420 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.530417 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.532403 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.532917 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.533174 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.533272 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.533314 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.533721 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.534277 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.534714 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.534749 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.534816 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.535070 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.537554 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.539691 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.539893 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.541367 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.541636 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.541819 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.543304 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.543316 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.543490 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.544649 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.544719 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.544843 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.546274 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.546480 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.547088 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.547200 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.547261 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.547338 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.547595 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.548058 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.548297 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.548905 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.549034 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.549289 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.549894 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.551898 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.553727 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.554154 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.554878 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.555054 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.555241 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.555733 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.555761 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.555853 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 
nodeName:}" failed. No retries permitted until 2026-03-21 04:24:42.055822995 +0000 UTC m=+86.383609671 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.555905 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.555367 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.556001 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.556018 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.556066 4839 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:42.056051762 +0000 UTC m=+86.383838458 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.555457 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.556889 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.557542 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.557742 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.557881 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.558075 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.558159 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.571018 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.571656 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.576307 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.587708 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.587850 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.587933 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.588036 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.588130 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.592437 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.593945 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628673 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628763 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628830 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628847 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628863 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628876 4839 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628890 4839 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628903 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628917 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628931 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628946 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628960 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628973 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.628987 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629000 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629014 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629027 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629045 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629059 4839 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629072 4839 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629087 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629100 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629113 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629127 4839 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629140 4839 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629152 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629169 4839 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629183 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629199 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629212 4839 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629230 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629242 4839 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629255 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629268 4839 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629283 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629297 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629310 4839 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629323 4839 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629340 4839 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629353 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629366 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629381 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629397 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629410 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629423 4839 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629435 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629448 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629460 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629473 4839 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629485 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629498 4839 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629511 4839 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629524 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629537 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629554 4839 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629592 4839 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629609 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629625 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629642 4839 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629656 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629675 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629694 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629711 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629727 4839 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629745 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629763 4839 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629781 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629798 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629814 4839 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629831 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629848 4839 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629865 4839 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629863 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629882 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629944 4839 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629930 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.629964 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630082 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630107 4839 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630128 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630147 4839 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630170 4839 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630191 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630213 4839 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630233 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630257 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630280 4839 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630301 4839 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630323 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630342 4839 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630361 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630383 4839 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630403 4839 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630424 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630444 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630465 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630485 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.630504 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.690551 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.690668 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.690688 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.690705 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.690715 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.746150 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.755520 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.763752 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.777911 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 21 04:24:41 crc kubenswrapper[4839]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash
Mar 21 04:24:41 crc kubenswrapper[4839]: set -o allexport
Mar 21 04:24:41 crc kubenswrapper[4839]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then
Mar 21 04:24:41 crc kubenswrapper[4839]: source /etc/kubernetes/apiserver-url.env
Mar 21 04:24:41 crc kubenswrapper[4839]: else
Mar 21 04:24:41 crc kubenswrapper[4839]: echo "Error: /etc/kubernetes/apiserver-url.env is missing"
Mar 21 04:24:41 crc kubenswrapper[4839]: exit 1
Mar 21 04:24:41 crc kubenswrapper[4839]: fi
Mar 21 04:24:41 crc kubenswrapper[4839]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104
Mar 21 04:24:41 crc kubenswrapper[4839]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 21 04:24:41 crc kubenswrapper[4839]: > logger="UnhandledError"
Mar 21 04:24:41 crc kubenswrapper[4839]: W0321 04:24:41.778412 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-99a38806bf8af67d7dcd06a36784129e6b764322bb3536a7eb3fece336067d6b WatchSource:0}: Error finding container 99a38806bf8af67d7dcd06a36784129e6b764322bb3536a7eb3fece336067d6b: Status 404 returned error can't find the container with id 99a38806bf8af67d7dcd06a36784129e6b764322bb3536a7eb3fece336067d6b
Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.779528 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312"
Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.781698 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.782941 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 21 04:24:41 crc kubenswrapper[4839]: W0321 04:24:41.785649 4839 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-72b98d76e39f4cbef24cdca436ed49737d850e0bb69ce99ba733a15e25210046 WatchSource:0}: Error finding container 72b98d76e39f4cbef24cdca436ed49737d850e0bb69ce99ba733a15e25210046: Status 404 returned error can't find the container with id 72b98d76e39f4cbef24cdca436ed49737d850e0bb69ce99ba733a15e25210046 Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.789158 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:41 crc kubenswrapper[4839]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 21 04:24:41 crc kubenswrapper[4839]: if [[ -f "/env/_master" ]]; then Mar 21 04:24:41 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: source "/env/_master" Mar 21 04:24:41 crc kubenswrapper[4839]: set +o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: fi Mar 21 04:24:41 crc kubenswrapper[4839]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 21 04:24:41 crc kubenswrapper[4839]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 21 04:24:41 crc kubenswrapper[4839]: ho_enable="--enable-hybrid-overlay" Mar 21 04:24:41 crc kubenswrapper[4839]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 21 04:24:41 crc kubenswrapper[4839]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 21 04:24:41 crc kubenswrapper[4839]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 21 04:24:41 crc kubenswrapper[4839]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 21 04:24:41 crc kubenswrapper[4839]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --webhook-host=127.0.0.1 \ Mar 21 04:24:41 crc kubenswrapper[4839]: --webhook-port=9743 \ Mar 21 04:24:41 crc kubenswrapper[4839]: ${ho_enable} \ Mar 21 04:24:41 crc kubenswrapper[4839]: --enable-interconnect \ Mar 21 04:24:41 crc kubenswrapper[4839]: --disable-approver \ Mar 21 04:24:41 crc kubenswrapper[4839]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --wait-for-kubernetes-api=200s \ Mar 21 04:24:41 crc kubenswrapper[4839]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --loglevel="${LOGLEVEL}" Mar 21 04:24:41 crc kubenswrapper[4839]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:41 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.791636 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:41 crc kubenswrapper[4839]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 21 04:24:41 crc 
kubenswrapper[4839]: if [[ -f "/env/_master" ]]; then Mar 21 04:24:41 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: source "/env/_master" Mar 21 04:24:41 crc kubenswrapper[4839]: set +o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: fi Mar 21 04:24:41 crc kubenswrapper[4839]: Mar 21 04:24:41 crc kubenswrapper[4839]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 21 04:24:41 crc kubenswrapper[4839]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 21 04:24:41 crc kubenswrapper[4839]: --disable-webhook \ Mar 21 04:24:41 crc kubenswrapper[4839]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --loglevel="${LOGLEVEL}" Mar 21 04:24:41 crc kubenswrapper[4839]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:41 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.792781 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.792779 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: 
\"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.792853 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.792878 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.792909 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.792930 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.895994 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.896067 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.896106 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.896123 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.896134 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.943354 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"99a38806bf8af67d7dcd06a36784129e6b764322bb3536a7eb3fece336067d6b"} Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.945319 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1f70681832960db1114ce205a81f59091acc07b1d7930b16e5ad95e81eedc8e5"} Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.945942 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.947103 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.947111 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"72b98d76e39f4cbef24cdca436ed49737d850e0bb69ce99ba733a15e25210046"} Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.947151 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:41 crc kubenswrapper[4839]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 21 04:24:41 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 21 04:24:41 crc kubenswrapper[4839]: source /etc/kubernetes/apiserver-url.env Mar 21 04:24:41 crc kubenswrapper[4839]: else Mar 21 04:24:41 crc kubenswrapper[4839]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 21 04:24:41 crc kubenswrapper[4839]: exit 1 Mar 21 04:24:41 crc kubenswrapper[4839]: fi Mar 21 04:24:41 crc kubenswrapper[4839]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 21 04:24:41 crc kubenswrapper[4839]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:41 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.948247 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.948383 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:41 crc kubenswrapper[4839]: container 
&Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 21 04:24:41 crc kubenswrapper[4839]: if [[ -f "/env/_master" ]]; then Mar 21 04:24:41 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: source "/env/_master" Mar 21 04:24:41 crc kubenswrapper[4839]: set +o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: fi Mar 21 04:24:41 crc kubenswrapper[4839]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 21 04:24:41 crc kubenswrapper[4839]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 21 04:24:41 crc kubenswrapper[4839]: ho_enable="--enable-hybrid-overlay" Mar 21 04:24:41 crc kubenswrapper[4839]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 21 04:24:41 crc kubenswrapper[4839]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 21 04:24:41 crc kubenswrapper[4839]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 21 04:24:41 crc kubenswrapper[4839]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 21 04:24:41 crc kubenswrapper[4839]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --webhook-host=127.0.0.1 \ Mar 21 04:24:41 crc kubenswrapper[4839]: --webhook-port=9743 \ Mar 21 04:24:41 crc kubenswrapper[4839]: ${ho_enable} \ Mar 21 04:24:41 crc kubenswrapper[4839]: --enable-interconnect \ Mar 21 04:24:41 crc kubenswrapper[4839]: --disable-approver \ Mar 21 04:24:41 crc kubenswrapper[4839]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --wait-for-kubernetes-api=200s \ Mar 21 04:24:41 crc kubenswrapper[4839]: 
--pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --loglevel="${LOGLEVEL}" Mar 21 04:24:41 crc kubenswrapper[4839]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:41 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.950367 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:41 crc kubenswrapper[4839]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 21 04:24:41 crc kubenswrapper[4839]: if [[ -f "/env/_master" ]]; then Mar 21 04:24:41 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: source "/env/_master" Mar 21 04:24:41 crc kubenswrapper[4839]: set +o allexport Mar 21 04:24:41 crc kubenswrapper[4839]: fi Mar 21 04:24:41 crc kubenswrapper[4839]: Mar 21 04:24:41 crc kubenswrapper[4839]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 21 04:24:41 crc kubenswrapper[4839]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 21 04:24:41 crc kubenswrapper[4839]: --disable-webhook \ Mar 21 04:24:41 crc kubenswrapper[4839]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 21 04:24:41 crc kubenswrapper[4839]: --loglevel="${LOGLEVEL}" Mar 21 04:24:41 crc kubenswrapper[4839]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:41 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:41 crc kubenswrapper[4839]: E0321 04:24:41.951642 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.955982 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.971419 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.983247 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.992537 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.998182 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.998224 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.998243 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.998261 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 21 04:24:41 crc kubenswrapper[4839]: I0321 04:24:41.998273 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:41Z","lastTransitionTime":"2026-03-21T04:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.001600 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.012001 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.021001 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.030891 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.033455 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.033668 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:24:43.033624184 +0000 UTC m=+87.361411000 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.033800 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.033933 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.033969 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.034021 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:43.034010196 +0000 UTC m=+87.361796872 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.034074 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.034113 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:43.034100979 +0000 UTC m=+87.361887865 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.040647 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.049941 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.058366 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.069995 4839 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.100765 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.100810 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.100821 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.100839 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.100852 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.135433 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.135479 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135648 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135665 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135676 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135712 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 
04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135781 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135799 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135728 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:43.135713147 +0000 UTC m=+87.463499823 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:42 crc kubenswrapper[4839]: E0321 04:24:42.135903 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:43.135878052 +0000 UTC m=+87.463664748 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.205550 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.205615 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.205625 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.205643 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.205658 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.308055 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.308097 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.308108 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.308124 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.308135 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.409967 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.410029 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.410050 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.410074 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.410088 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.456630 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.457118 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.457939 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.458633 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.459283 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.459868 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.460476 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.461103 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.461757 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.462252 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.462771 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.463421 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.463898 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.464420 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.464930 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.465406 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.466008 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.466443 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.468416 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.469721 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.471171 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.472244 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.473188 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.475107 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.476014 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.477666 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.478522 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.480019 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.481435 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.482844 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.483516 4839 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.483676 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.486671 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.487391 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.487958 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.490194 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.491646 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.492404 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.493850 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.495181 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.496753 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.498096 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.499876 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.500857 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.502146 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.502920 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.504104 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.505198 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.506394 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.507095 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.508293 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.509011 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.509825 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.511017 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.512270 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.512297 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:42 crc 
kubenswrapper[4839]: I0321 04:24:42.512310 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.512324 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.512336 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.614305 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.614348 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.614359 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.614371 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.614381 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.716088 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.716130 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.716142 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.716159 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.716169 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.818415 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.818449 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.818456 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.818469 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.818477 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.920427 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.920456 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.920463 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.920475 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:42 crc kubenswrapper[4839]: I0321 04:24:42.920484 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:42Z","lastTransitionTime":"2026-03-21T04:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.022218 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.022258 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.022269 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.022283 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.022291 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.041646 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.041713 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.041753 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.041779 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:24:45.04175826 +0000 UTC m=+89.369544936 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.041836 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.041840 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.041877 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:45.041869103 +0000 UTC m=+89.369655769 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.041889 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-21 04:24:45.041883954 +0000 UTC m=+89.369670630 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.124925 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.124964 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.124974 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.124990 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.125000 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.142266 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.142323 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.142427 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.142437 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.142476 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.142489 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 
04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.142540 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:45.142521692 +0000 UTC m=+89.470308368 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.142443 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.142922 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.143035 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:45.143020437 +0000 UTC m=+89.470807173 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.227094 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.227127 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.227136 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.227148 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.227156 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.329356 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.329399 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.329409 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.329425 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.329437 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.431952 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.431994 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.432006 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.432022 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.432033 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.451848 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.451869 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.451861 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.451978 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.452115 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.452230 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.534361 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.534401 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.534411 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.534443 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.534454 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.637831 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.638139 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.638225 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.638345 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.638450 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.739321 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.739370 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.739380 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.739395 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.739405 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.748939 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.751793 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.751826 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.751835 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.751848 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.751856 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.759743 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.763639 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.763668 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.763677 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.763693 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.763705 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.787093 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.787407 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.787489 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.787590 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.787690 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.795673 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:43 crc kubenswrapper[4839]: E0321 04:24:43.795781 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.798107 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.798133 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.798142 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.798158 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.798168 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.900223 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.900295 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.900307 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.900328 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:43 crc kubenswrapper[4839]: I0321 04:24:43.900346 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:43Z","lastTransitionTime":"2026-03-21T04:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.003297 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.003878 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.003964 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.004082 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.004165 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.108098 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.108138 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.108148 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.108163 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.108172 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.211452 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.211527 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.211547 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.211606 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.211624 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.314174 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.314238 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.314257 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.314280 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.314297 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.417446 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.417520 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.417535 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.417559 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.417589 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.520880 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.520973 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.520995 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.521030 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.521054 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.624148 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.624235 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.624253 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.624284 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.624308 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.730220 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.730267 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.730279 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.730296 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.730307 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.839517 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.839600 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.839615 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.839634 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.839647 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.942708 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.942779 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.942797 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.942828 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:44 crc kubenswrapper[4839]: I0321 04:24:44.942851 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:44Z","lastTransitionTime":"2026-03-21T04:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.046108 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.046194 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.046217 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.046254 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.046275 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.064860 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.064975 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.065029 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.065156 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.065220 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:49.065201221 +0000 UTC m=+93.392987897 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.065409 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.065507 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:49.06549698 +0000 UTC m=+93.393283656 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.065661 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:24:49.065651095 +0000 UTC m=+93.393437781 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.149132 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.149373 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.149447 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.149512 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.149599 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.165657 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.165747 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.165912 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.165936 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.165965 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.165971 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.165986 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.165993 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.166083 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:49.166059726 +0000 UTC m=+93.493846432 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.166561 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:49.166544241 +0000 UTC m=+93.494330977 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.252384 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.252451 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.252465 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.252485 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.252496 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.354721 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.354773 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.354783 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.354797 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.354807 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.452118 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.452148 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.452196 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.452278 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.452367 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:45 crc kubenswrapper[4839]: E0321 04:24:45.452435 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.456637 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.456674 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.456688 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.456706 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.456719 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.559121 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.559186 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.559195 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.559211 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.559220 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.661384 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.661446 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.661458 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.661475 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.661487 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.763672 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.763720 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.763732 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.763748 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.763760 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.865524 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.865609 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.865622 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.865635 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.865644 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.968117 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.968193 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.968213 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.968239 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:45 crc kubenswrapper[4839]: I0321 04:24:45.968258 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:45Z","lastTransitionTime":"2026-03-21T04:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.071081 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.071206 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.071235 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.071274 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.071302 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.174169 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.174240 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.174252 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.174268 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.174280 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.276297 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.276334 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.276343 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.276356 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.276364 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.378489 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.378533 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.378541 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.378555 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.378579 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.465222 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.475210 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.481609 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.481789 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 
04:24:46.481967 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.482122 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.482260 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.489291 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.500661 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.510840 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.520473 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.584272 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.584305 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.584313 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.584327 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.584337 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.686471 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.686511 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.686520 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.686534 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.686544 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.788824 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.788871 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.788881 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.788898 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.788910 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.890681 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.890722 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.890734 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.890751 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.890762 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.993345 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.993389 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.993405 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.993423 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:46 crc kubenswrapper[4839]: I0321 04:24:46.993433 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:46Z","lastTransitionTime":"2026-03-21T04:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.095224 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.095263 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.095272 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.095286 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.095296 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.196939 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.196985 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.197010 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.197024 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.197033 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.299404 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.299768 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.299873 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.300003 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.300097 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.405213 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.405265 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.405284 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.405301 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.405314 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.452235 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.452319 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:47 crc kubenswrapper[4839]: E0321 04:24:47.452397 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.452478 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:47 crc kubenswrapper[4839]: E0321 04:24:47.452532 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:47 crc kubenswrapper[4839]: E0321 04:24:47.452710 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.463220 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.463381 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:24:47 crc kubenswrapper[4839]: E0321 04:24:47.463540 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.507238 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.507291 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.507305 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.507318 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.507327 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.609191 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.609245 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.609270 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.609292 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.609309 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.711867 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.711918 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.711928 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.711942 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.711953 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.813780 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.813827 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.813839 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.813854 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.813865 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.916448 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.916495 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.916505 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.916521 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.916531 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:47Z","lastTransitionTime":"2026-03-21T04:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:47 crc kubenswrapper[4839]: I0321 04:24:47.963124 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" Mar 21 04:24:47 crc kubenswrapper[4839]: E0321 04:24:47.963267 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.018539 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.018588 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.018606 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.018626 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.018636 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.120921 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.120956 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.120967 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.120982 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.120993 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.222639 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.222689 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.222703 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.222723 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.222734 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.325330 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.325383 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.325395 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.325413 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.325425 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.427270 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.427336 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.427352 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.427373 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.427397 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.529363 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.529407 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.529418 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.529435 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.529446 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.631997 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.632042 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.632056 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.632073 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.632084 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.734552 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.734626 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.734644 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.734667 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.734682 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.837108 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.837170 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.837187 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.837208 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.837225 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.939960 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.940012 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.940026 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.940047 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:48 crc kubenswrapper[4839]: I0321 04:24:48.940063 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:48Z","lastTransitionTime":"2026-03-21T04:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.043191 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.043262 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.043279 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.043303 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.043320 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.097407 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.097544 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.097635 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:24:57.097557346 +0000 UTC m=+101.425344072 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.097700 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.097757 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.097779 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:57.097757522 +0000 UTC m=+101.425544238 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.097920 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.097986 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:57.097972249 +0000 UTC m=+101.425758955 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.145774 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.145849 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.145875 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.145905 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.145931 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.199214 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.199285 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199451 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199507 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199526 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199460 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199646 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199616 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:57.199564127 +0000 UTC m=+101.527350803 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199667 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.199799 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:24:57.199771433 +0000 UTC m=+101.527558139 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.247873 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.247911 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.247921 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.247937 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.247949 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.350400 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.350437 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.350446 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.350460 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.350470 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.451712 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.451771 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.451890 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.451792 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.451993 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:49 crc kubenswrapper[4839]: E0321 04:24:49.452202 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.453227 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.453289 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.453315 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.453348 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.453366 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.555305 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.555365 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.555381 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.555404 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.555420 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.657233 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.657297 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.657311 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.657328 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.657341 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.759196 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.759253 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.759271 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.759289 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.759300 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.862172 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.862209 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.862218 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.862232 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.862243 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.968474 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.968536 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.968547 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.968562 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:49 crc kubenswrapper[4839]: I0321 04:24:49.968594 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:49Z","lastTransitionTime":"2026-03-21T04:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.070878 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.070918 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.070927 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.070942 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.070951 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.173259 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.173359 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.173433 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.173457 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.173469 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.276560 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.276624 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.276635 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.276660 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.276672 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.379317 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.379376 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.379390 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.379412 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.379426 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.481396 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.481437 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.481445 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.481458 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.481468 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.584558 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.584677 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.584697 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.584727 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.584747 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.688275 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.688363 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.688373 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.688391 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.688401 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.790810 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.790869 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.790880 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.790894 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.790902 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.893408 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.893438 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.893445 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.893458 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.893468 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.994890 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.994950 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.994960 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.994974 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:50 crc kubenswrapper[4839]: I0321 04:24:50.994985 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:50Z","lastTransitionTime":"2026-03-21T04:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.097278 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.097319 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.097331 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.097346 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.097357 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.200230 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.200271 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.200282 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.200299 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.200309 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.301981 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.302024 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.302033 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.302048 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.302058 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.403691 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.403732 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.403743 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.403756 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.403765 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.452156 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.452200 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.452173 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:51 crc kubenswrapper[4839]: E0321 04:24:51.452276 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:51 crc kubenswrapper[4839]: E0321 04:24:51.452377 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:51 crc kubenswrapper[4839]: E0321 04:24:51.452452 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.506358 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.506410 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.506420 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.506435 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.506446 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.608442 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.608493 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.608503 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.608518 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.608528 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.711736 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.711769 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.711777 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.711790 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.711798 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.814512 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.814583 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.814598 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.814621 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.814635 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.916953 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.917009 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.917019 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.917034 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:51 crc kubenswrapper[4839]: I0321 04:24:51.917046 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:51Z","lastTransitionTime":"2026-03-21T04:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.019909 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.019952 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.019961 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.019975 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.019985 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.122775 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.122831 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.122842 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.122857 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.122865 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.225628 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.225692 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.225705 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.225719 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.225728 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.327302 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.327335 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.327344 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.327359 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.327368 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.430394 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.430437 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.430446 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.430461 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.430471 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.533672 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.533719 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.533729 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.533744 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.533754 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.636055 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.636147 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.636175 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.636209 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.636234 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.739222 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.739282 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.739294 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.739316 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.739329 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.843382 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.843484 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.843500 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.843526 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.843544 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.945198 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.945248 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.945263 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.945278 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:52 crc kubenswrapper[4839]: I0321 04:24:52.945287 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:52Z","lastTransitionTime":"2026-03-21T04:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.047786 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.047849 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.047867 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.047891 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.047912 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.151481 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.151520 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.151531 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.151547 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.151558 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.254462 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.255867 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.256027 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.256218 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.256420 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.358797 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.358845 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.358855 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.358873 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.358884 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.452197 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.452339 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.452443 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.452655 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.453069 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.453504 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.461465 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.461527 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.461545 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.461594 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.461612 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.475466 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.564046 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.564097 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.564113 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.564136 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.564154 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.666762 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.666825 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.666841 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.666867 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.666885 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.770017 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.770060 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.770068 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.770083 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.770092 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.877932 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.878011 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.878041 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.878072 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.878092 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.897400 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.897427 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.897436 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.897446 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.897471 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.913747 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.919035 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.919099 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.919109 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.919126 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.919136 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.932773 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.938153 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.938202 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.938213 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.938227 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.938238 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.951777 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.955377 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.955403 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.955412 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.955424 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.955433 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.968929 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.973046 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.973074 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.973081 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.973092 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.973102 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.987453 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:53 crc kubenswrapper[4839]: E0321 04:24:53.987640 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.989204 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.989232 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.989240 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.989255 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:53 crc kubenswrapper[4839]: I0321 04:24:53.989265 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:53Z","lastTransitionTime":"2026-03-21T04:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.092007 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.092060 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.092077 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.092100 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.092118 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.194998 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.195063 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.195085 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.195115 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.195135 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.298008 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.298099 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.298114 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.298131 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.298143 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.400499 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.400606 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.400631 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.400666 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.400766 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.503166 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.503213 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.503222 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.503237 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.503246 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.605861 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.605912 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.605945 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.605958 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.605969 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.708093 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.708123 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.708132 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.708147 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.708157 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.810308 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.810343 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.810351 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.810363 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.810373 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.913131 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.913172 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.913182 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.913197 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:54 crc kubenswrapper[4839]: I0321 04:24:54.913206 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:54Z","lastTransitionTime":"2026-03-21T04:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.015251 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.015305 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.015314 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.015327 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.015336 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.117980 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.118013 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.118022 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.118036 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.118045 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.220356 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.220435 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.220454 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.220477 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.220494 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.322522 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.322617 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.322631 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.322653 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.322665 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.424886 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.424927 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.424936 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.424950 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.424958 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.452547 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.452556 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.452640 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:55 crc kubenswrapper[4839]: E0321 04:24:55.452775 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:55 crc kubenswrapper[4839]: E0321 04:24:55.453138 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:55 crc kubenswrapper[4839]: E0321 04:24:55.453275 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:55 crc kubenswrapper[4839]: E0321 04:24:55.454797 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:55 crc kubenswrapper[4839]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 21 04:24:55 crc kubenswrapper[4839]: if [[ -f "/env/_master" ]]; then Mar 21 04:24:55 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:55 crc kubenswrapper[4839]: source "/env/_master" Mar 21 04:24:55 crc kubenswrapper[4839]: set +o allexport Mar 21 04:24:55 crc kubenswrapper[4839]: fi Mar 21 04:24:55 crc kubenswrapper[4839]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 21 04:24:55 crc kubenswrapper[4839]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 21 04:24:55 crc kubenswrapper[4839]: ho_enable="--enable-hybrid-overlay" Mar 21 04:24:55 crc kubenswrapper[4839]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 21 04:24:55 crc kubenswrapper[4839]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 21 04:24:55 crc kubenswrapper[4839]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 21 04:24:55 crc kubenswrapper[4839]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 21 04:24:55 crc kubenswrapper[4839]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 21 04:24:55 crc kubenswrapper[4839]: --webhook-host=127.0.0.1 \ Mar 21 04:24:55 crc kubenswrapper[4839]: --webhook-port=9743 \ Mar 21 04:24:55 crc kubenswrapper[4839]: ${ho_enable} \ Mar 21 04:24:55 crc kubenswrapper[4839]: --enable-interconnect \ Mar 21 04:24:55 crc 
kubenswrapper[4839]: --disable-approver \ Mar 21 04:24:55 crc kubenswrapper[4839]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 21 04:24:55 crc kubenswrapper[4839]: --wait-for-kubernetes-api=200s \ Mar 21 04:24:55 crc kubenswrapper[4839]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 21 04:24:55 crc kubenswrapper[4839]: --loglevel="${LOGLEVEL}" Mar 21 04:24:55 crc kubenswrapper[4839]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions
:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:55 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:55 crc kubenswrapper[4839]: E0321 04:24:55.458425 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:55 crc kubenswrapper[4839]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 21 04:24:55 crc kubenswrapper[4839]: if [[ -f "/env/_master" ]]; then Mar 21 04:24:55 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:55 crc kubenswrapper[4839]: source "/env/_master" Mar 21 04:24:55 crc kubenswrapper[4839]: set +o allexport Mar 21 04:24:55 crc kubenswrapper[4839]: fi Mar 21 04:24:55 crc kubenswrapper[4839]: Mar 21 04:24:55 crc kubenswrapper[4839]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 21 04:24:55 crc kubenswrapper[4839]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 21 04:24:55 crc kubenswrapper[4839]: --disable-webhook \ Mar 21 04:24:55 crc kubenswrapper[4839]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 21 04:24:55 crc kubenswrapper[4839]: --loglevel="${LOGLEVEL}" Mar 21 04:24:55 crc kubenswrapper[4839]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:55 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:55 crc kubenswrapper[4839]: E0321 04:24:55.459505 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.523258 4839 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.527444 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.527470 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.527481 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.527498 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.527508 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.629491 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.629519 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.629527 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.629539 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.629548 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.732217 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.732276 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.732299 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.732324 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.732344 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.834429 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.834498 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.834510 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.834542 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.834554 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.937425 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.937486 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.937504 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.937530 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:55 crc kubenswrapper[4839]: I0321 04:24:55.937548 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:55Z","lastTransitionTime":"2026-03-21T04:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.040505 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.040590 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.040610 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.040633 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.040649 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.153807 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.153866 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.153878 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.153896 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.153909 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.257047 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.257118 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.257130 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.257148 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.257163 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.359412 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.359441 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.359449 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.359461 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.359469 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: E0321 04:24:56.454604 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 21 04:24:56 crc kubenswrapper[4839]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 21 04:24:56 crc kubenswrapper[4839]: set -o allexport Mar 21 04:24:56 crc kubenswrapper[4839]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 21 04:24:56 crc kubenswrapper[4839]: source /etc/kubernetes/apiserver-url.env Mar 21 04:24:56 crc kubenswrapper[4839]: else Mar 21 04:24:56 crc kubenswrapper[4839]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 21 04:24:56 crc kubenswrapper[4839]: exit 1 Mar 21 04:24:56 crc kubenswrapper[4839]: fi Mar 21 04:24:56 crc kubenswrapper[4839]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 21 04:24:56 crc kubenswrapper[4839]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 21 04:24:56 crc kubenswrapper[4839]: > logger="UnhandledError" Mar 21 04:24:56 crc kubenswrapper[4839]: E0321 04:24:56.454817 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 21 04:24:56 crc kubenswrapper[4839]: E0321 04:24:56.455977 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 21 04:24:56 crc kubenswrapper[4839]: E0321 04:24:56.456018 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.460996 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.461027 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.461035 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.461049 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.461059 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.465463 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de4457
5a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.474417 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.484112 4839 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.493278 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.503623 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.508349 4839 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.519121 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.528977 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.540998 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.563457 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.563495 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 
04:24:56.563506 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.563523 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.563537 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.666611 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.666661 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.666674 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.666693 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.666704 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.768992 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.769247 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.769368 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.769461 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.769550 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.872296 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.872677 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.872804 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.872967 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.873085 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.976221 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.976283 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.976310 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.976339 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:56 crc kubenswrapper[4839]: I0321 04:24:56.976362 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:56Z","lastTransitionTime":"2026-03-21T04:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.078816 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.078882 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.078900 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.078924 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.078942 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.170443 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.170521 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.170650 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:25:13.170611285 +0000 UTC m=+117.498397991 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.170648 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.170744 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.170847 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:13.170828332 +0000 UTC m=+117.498615098 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.170973 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.171107 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:13.17107739 +0000 UTC m=+117.498864106 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.181995 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.182039 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.182051 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.182067 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.182078 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.271964 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.272005 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272103 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272120 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272121 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272169 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272187 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272130 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272253 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:13.272232144 +0000 UTC m=+117.600018850 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.272277 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:13.272266255 +0000 UTC m=+117.600052961 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.284745 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.284813 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.284831 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.284858 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.284877 4839 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.388244 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.388315 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.388334 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.388361 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.388383 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.452410 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.452493 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.452558 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.452413 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.452682 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:57 crc kubenswrapper[4839]: E0321 04:24:57.452681 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.491033 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.491076 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.491085 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.491100 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.491109 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.593551 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.593627 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.593644 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.593666 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.593694 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.696543 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.696600 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.696613 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.696636 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.696648 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.799689 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.799753 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.799770 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.799794 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.799810 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.902950 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.902984 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.902993 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.903006 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:57 crc kubenswrapper[4839]: I0321 04:24:57.903015 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:57Z","lastTransitionTime":"2026-03-21T04:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.005485 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.005597 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.005616 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.005639 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.005656 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.107988 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.108057 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.108109 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.108132 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.108149 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.211062 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.211112 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.211124 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.211138 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.211150 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.313911 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.313965 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.313977 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.313997 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.314009 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.415739 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.415793 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.415805 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.415822 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.415834 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.453129 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.518040 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.518103 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.518122 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.518147 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.518165 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.621284 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.621681 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.621694 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.621714 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.621725 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.724496 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.724562 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.724620 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.724649 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.724671 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.827243 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.827307 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.827335 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.827365 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.827389 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.930481 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.930548 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.930589 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.930622 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.930638 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:58Z","lastTransitionTime":"2026-03-21T04:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.990399 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.991702 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87"} Mar 21 04:24:58 crc kubenswrapper[4839]: I0321 04:24:58.992058 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.012121 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269
019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o:
//6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.023890 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.031779 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.033107 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.033141 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 
04:24:59.033153 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.033170 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.033181 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.040988 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.048924 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.056231 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.063231 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.072103 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.135474 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.135508 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.135517 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.135531 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.135540 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.237166 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.237199 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.237208 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.237220 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.237229 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.340153 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.340198 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.340209 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.340226 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.340237 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.443003 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.443045 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.443056 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.443073 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.443084 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.452409 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.452432 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.452560 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:24:59 crc kubenswrapper[4839]: E0321 04:24:59.452662 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:24:59 crc kubenswrapper[4839]: E0321 04:24:59.452861 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:24:59 crc kubenswrapper[4839]: E0321 04:24:59.452949 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.546666 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.546694 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.546703 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.546719 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.546727 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.649827 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.649875 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.649891 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.649909 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.649921 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.752410 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.752464 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.752478 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.752494 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.752503 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.855184 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.855232 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.855243 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.855257 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.855269 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.957861 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.957894 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.957902 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.957915 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:24:59 crc kubenswrapper[4839]: I0321 04:24:59.957925 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:24:59Z","lastTransitionTime":"2026-03-21T04:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.060412 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.060468 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.060499 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.060517 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.060527 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.163113 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.163160 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.163171 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.163185 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.163196 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.265759 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.266141 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.266155 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.266170 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.266180 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.278645 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-g47qh"] Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.279117 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.281454 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.281669 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.281540 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.299229 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.310139 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.318608 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.337531 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.349339 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.357453 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.366518 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.368680 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.368718 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.368728 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.368743 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.368754 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.376672 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.382840 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.398453 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljz2v\" (UniqueName: \"kubernetes.io/projected/e646dbcd-c976-48e4-8dee-497be8a275bf-kube-api-access-ljz2v\") pod \"node-resolver-g47qh\" (UID: \"e646dbcd-c976-48e4-8dee-497be8a275bf\") " pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.398547 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e646dbcd-c976-48e4-8dee-497be8a275bf-hosts-file\") pod 
\"node-resolver-g47qh\" (UID: \"e646dbcd-c976-48e4-8dee-497be8a275bf\") " pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.470853 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.470947 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.470971 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.471012 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.471041 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.499067 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljz2v\" (UniqueName: \"kubernetes.io/projected/e646dbcd-c976-48e4-8dee-497be8a275bf-kube-api-access-ljz2v\") pod \"node-resolver-g47qh\" (UID: \"e646dbcd-c976-48e4-8dee-497be8a275bf\") " pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.499125 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e646dbcd-c976-48e4-8dee-497be8a275bf-hosts-file\") pod \"node-resolver-g47qh\" (UID: \"e646dbcd-c976-48e4-8dee-497be8a275bf\") " pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.499217 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e646dbcd-c976-48e4-8dee-497be8a275bf-hosts-file\") pod \"node-resolver-g47qh\" (UID: \"e646dbcd-c976-48e4-8dee-497be8a275bf\") " pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.516375 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljz2v\" (UniqueName: \"kubernetes.io/projected/e646dbcd-c976-48e4-8dee-497be8a275bf-kube-api-access-ljz2v\") pod \"node-resolver-g47qh\" (UID: \"e646dbcd-c976-48e4-8dee-497be8a275bf\") " pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.573930 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.574171 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.574300 4839 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.574398 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.574492 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.603874 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g47qh" Mar 21 04:25:00 crc kubenswrapper[4839]: W0321 04:25:00.624947 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode646dbcd_c976_48e4_8dee_497be8a275bf.slice/crio-ad11ae6b1f76dc4d610d074d746058d2d1171fb3a53b621a66c771b9c432c307 WatchSource:0}: Error finding container ad11ae6b1f76dc4d610d074d746058d2d1171fb3a53b621a66c771b9c432c307: Status 404 returned error can't find the container with id ad11ae6b1f76dc4d610d074d746058d2d1171fb3a53b621a66c771b9c432c307 Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.649348 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-jx4q7"] Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.650106 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.653053 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.653447 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.653814 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.653989 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-scp2c"] Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.654339 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.655036 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.655892 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.656626 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-zqcw4"] Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.657166 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.658229 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.658445 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.658444 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.658752 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.658833 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.660804 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.661120 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.667012 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.677653 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.679439 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.679469 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.679478 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.679492 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.679503 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.691247 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.704210 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.737879 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.755049 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.782117 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.782162 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.782200 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.782218 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 
04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.782229 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.786190 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSt
atuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.796903 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801729 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-mcd-auth-proxy-config\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801776 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jqbw\" (UniqueName: \"kubernetes.io/projected/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-kube-api-access-9jqbw\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801799 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1602189b-f4f3-40ee-ba63-c695c11069d0-cni-binary-copy\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801821 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-cnibin\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801842 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-os-release\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801864 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-cni-multus\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801884 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-kubelet\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801904 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-conf-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801923 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-rootfs\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801943 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-cni-bin\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801981 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-cnibin\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.801999 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-proxy-tls\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802017 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-system-cni-dir\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802037 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802062 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-cni-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802081 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-etc-kubernetes\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802104 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e0848faa-daf7-4b62-a20f-36d92678db1d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802123 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-k8s-cni-cncf-io\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802143 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0848faa-daf7-4b62-a20f-36d92678db1d-cni-binary-copy\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802173 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9gh6\" (UniqueName: \"kubernetes.io/projected/1602189b-f4f3-40ee-ba63-c695c11069d0-kube-api-access-f9gh6\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802191 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-os-release\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802208 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-hostroot\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802230 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-socket-dir-parent\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802244 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-netns\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802259 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-daemon-config\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802315 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-system-cni-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802378 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-multus-certs\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.802439 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6trk\" (UniqueName: \"kubernetes.io/projected/e0848faa-daf7-4b62-a20f-36d92678db1d-kube-api-access-v6trk\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.807182 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.815190 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.823711 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.833062 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.844772 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.856433 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.875232 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.885174 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.887741 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.887777 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.887785 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.887800 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.887813 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.896962 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903414 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0848faa-daf7-4b62-a20f-36d92678db1d-cni-binary-copy\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903470 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9gh6\" (UniqueName: \"kubernetes.io/projected/1602189b-f4f3-40ee-ba63-c695c11069d0-kube-api-access-f9gh6\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903491 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-os-release\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903512 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-hostroot\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903544 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-socket-dir-parent\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903584 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-netns\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903606 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-daemon-config\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903630 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6trk\" (UniqueName: 
\"kubernetes.io/projected/e0848faa-daf7-4b62-a20f-36d92678db1d-kube-api-access-v6trk\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903689 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-system-cni-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903711 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-multus-certs\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903757 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1602189b-f4f3-40ee-ba63-c695c11069d0-cni-binary-copy\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903780 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-mcd-auth-proxy-config\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903803 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jqbw\" (UniqueName: 
\"kubernetes.io/projected/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-kube-api-access-9jqbw\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903826 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-os-release\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903846 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-cni-multus\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903867 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-cnibin\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903890 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-kubelet\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903912 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-conf-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903932 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-cni-bin\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903954 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-rootfs\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.903975 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-cnibin\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904004 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-proxy-tls\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904027 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-system-cni-dir\") pod 
\"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904053 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904076 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-cni-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904100 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e0848faa-daf7-4b62-a20f-36d92678db1d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904123 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-k8s-cni-cncf-io\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904143 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-etc-kubernetes\") pod 
\"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904213 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-etc-kubernetes\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904294 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-os-release\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904348 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-cnibin\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904415 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-cni-multus\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904451 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-cnibin\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 
04:25:00.904483 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-kubelet\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904517 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-conf-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904545 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-var-lib-cni-bin\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904596 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-rootfs\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904651 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-cni-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904684 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-system-cni-dir\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.904931 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0848faa-daf7-4b62-a20f-36d92678db1d-cni-binary-copy\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.905172 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-system-cni-dir\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.905313 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-os-release\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.905350 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-hostroot\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.905384 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-socket-dir-parent\") pod \"multus-zqcw4\" (UID: 
\"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.905408 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-netns\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.905601 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e0848faa-daf7-4b62-a20f-36d92678db1d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.906092 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-mcd-auth-proxy-config\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.906164 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-k8s-cni-cncf-io\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.906253 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1602189b-f4f3-40ee-ba63-c695c11069d0-multus-daemon-config\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " 
pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.906268 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1602189b-f4f3-40ee-ba63-c695c11069d0-host-run-multus-certs\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.906364 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0848faa-daf7-4b62-a20f-36d92678db1d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.906820 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1602189b-f4f3-40ee-ba63-c695c11069d0-cni-binary-copy\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.909491 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-proxy-tls\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.919530 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9gh6\" (UniqueName: \"kubernetes.io/projected/1602189b-f4f3-40ee-ba63-c695c11069d0-kube-api-access-f9gh6\") pod \"multus-zqcw4\" (UID: \"1602189b-f4f3-40ee-ba63-c695c11069d0\") " pod="openshift-multus/multus-zqcw4" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.919960 4839 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8
e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.925171 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6trk\" (UniqueName: \"kubernetes.io/projected/e0848faa-daf7-4b62-a20f-36d92678db1d-kube-api-access-v6trk\") pod \"multus-additional-cni-plugins-scp2c\" (UID: \"e0848faa-daf7-4b62-a20f-36d92678db1d\") " pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.928252 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jqbw\" (UniqueName: \"kubernetes.io/projected/4f92fefb-d5cd-451a-8bbe-31eea55d5bd9-kube-api-access-9jqbw\") pod \"machine-config-daemon-jx4q7\" (UID: \"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\") " pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.931295 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.940122 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.947893 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.953987 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.979256 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:25:00 crc kubenswrapper[4839]: W0321 04:25:00.990274 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f92fefb_d5cd_451a_8bbe_31eea55d5bd9.slice/crio-f53f978a84252503a736dfceb91ada96eb5f3287ae5bc5d348304599bfc4be70 WatchSource:0}: Error finding container f53f978a84252503a736dfceb91ada96eb5f3287ae5bc5d348304599bfc4be70: Status 404 returned error can't find the container with id f53f978a84252503a736dfceb91ada96eb5f3287ae5bc5d348304599bfc4be70 Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.990829 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.990861 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.990873 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.990894 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.990907 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:00Z","lastTransitionTime":"2026-03-21T04:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.992058 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-scp2c" Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.996731 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g47qh" event={"ID":"e646dbcd-c976-48e4-8dee-497be8a275bf","Type":"ContainerStarted","Data":"2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.996794 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g47qh" event={"ID":"e646dbcd-c976-48e4-8dee-497be8a275bf","Type":"ContainerStarted","Data":"ad11ae6b1f76dc4d610d074d746058d2d1171fb3a53b621a66c771b9c432c307"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.997929 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"f53f978a84252503a736dfceb91ada96eb5f3287ae5bc5d348304599bfc4be70"} Mar 21 04:25:00 crc kubenswrapper[4839]: I0321 04:25:00.998047 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zqcw4" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.008160 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: W0321 04:25:01.014148 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0848faa_daf7_4b62_a20f_36d92678db1d.slice/crio-811b088a00fd73f9b5eefe3a065a3555e9a32793837e4125cec34d9084a87730 WatchSource:0}: Error finding container 811b088a00fd73f9b5eefe3a065a3555e9a32793837e4125cec34d9084a87730: Status 404 returned error can't find the container with id 811b088a00fd73f9b5eefe3a065a3555e9a32793837e4125cec34d9084a87730 Mar 21 04:25:01 crc kubenswrapper[4839]: W0321 04:25:01.019670 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1602189b_f4f3_40ee_ba63_c695c11069d0.slice/crio-5a140131f05d7b0d663ff416e2780ce265a03da4be693c89cc63345c8b65f3dd WatchSource:0}: Error finding container 5a140131f05d7b0d663ff416e2780ce265a03da4be693c89cc63345c8b65f3dd: Status 404 returned error can't find the container with id 5a140131f05d7b0d663ff416e2780ce265a03da4be693c89cc63345c8b65f3dd Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.019779 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-spl4b"] Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.020741 4839 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.022821 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.023030 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.023200 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.023233 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.023329 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.023439 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.023477 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.024962 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.034485 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.049764 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.067635 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a
67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.077049 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.087062 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.093771 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.093805 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.093814 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.093828 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.093837 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.094042 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.105371 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107645 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-node-log\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107678 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-log-socket\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107695 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-systemd\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107711 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdph2\" (UniqueName: \"kubernetes.io/projected/d634043b-c9ec-4469-b267-26053b1f02f9-kube-api-access-cdph2\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107728 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-ovn-kubernetes\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107741 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-script-lib\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107756 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-slash\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 
04:25:01.107770 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-kubelet\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107785 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-systemd-units\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107799 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-var-lib-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107830 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107856 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d634043b-c9ec-4469-b267-26053b1f02f9-ovn-node-metrics-cert\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.107957 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-netns\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.108006 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-config\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.108042 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-bin\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.108070 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-env-overrides\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.108130 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-ovn\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.108147 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.108196 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-etc-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.108221 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-netd\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.119998 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.131985 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.141400 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.150368 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.158485 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.169433 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.183437 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.194038 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.195387 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.195417 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.195430 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.195445 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.195455 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.205053 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209528 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-kubelet\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209562 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-systemd-units\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209591 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-var-lib-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209609 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209632 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d634043b-c9ec-4469-b267-26053b1f02f9-ovn-node-metrics-cert\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209648 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-netns\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209662 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-config\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209678 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-bin\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209685 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-var-lib-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209693 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-systemd-units\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209734 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-kubelet\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209693 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-env-overrides\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209761 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-netns\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209782 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209815 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-ovn\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209836 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209860 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-etc-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209884 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-netd\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209908 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-node-log\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209922 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-log-socket\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209941 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-systemd\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209955 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdph2\" (UniqueName: \"kubernetes.io/projected/d634043b-c9ec-4469-b267-26053b1f02f9-kube-api-access-cdph2\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209974 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-slash\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.209991 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-ovn-kubernetes\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210012 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-script-lib\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210168 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-env-overrides\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210206 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-bin\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210232 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-node-log\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210250 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-ovn\") pod \"ovnkube-node-spl4b\" (UID: 
\"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210270 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210289 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-etc-openvswitch\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210310 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-netd\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210539 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-log-socket\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210583 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-systemd\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc 
kubenswrapper[4839]: I0321 04:25:01.210609 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-slash\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210631 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-ovn-kubernetes\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210642 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-script-lib\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.210763 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-config\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.212916 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d634043b-c9ec-4469-b267-26053b1f02f9-ovn-node-metrics-cert\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.219655 4839 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269de
ec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e
51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.226016 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdph2\" (UniqueName: \"kubernetes.io/projected/d634043b-c9ec-4469-b267-26053b1f02f9-kube-api-access-cdph2\") pod \"ovnkube-node-spl4b\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.227717 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.243878 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.253139 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.260832 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.268269 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.277149 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.297990 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.298063 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.298082 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.298106 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.298122 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.376055 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:01 crc kubenswrapper[4839]: W0321 04:25:01.394375 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd634043b_c9ec_4469_b267_26053b1f02f9.slice/crio-f75e324ef6ce35e2f9d2ecb83aad79d37f2471563f3c265cdfc0e67df74a76f1 WatchSource:0}: Error finding container f75e324ef6ce35e2f9d2ecb83aad79d37f2471563f3c265cdfc0e67df74a76f1: Status 404 returned error can't find the container with id f75e324ef6ce35e2f9d2ecb83aad79d37f2471563f3c265cdfc0e67df74a76f1 Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.399467 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.399498 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.399511 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.399530 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.399542 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.451875 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.451942 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:01 crc kubenswrapper[4839]: E0321 04:25:01.452001 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:01 crc kubenswrapper[4839]: E0321 04:25:01.452070 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.452117 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:01 crc kubenswrapper[4839]: E0321 04:25:01.452175 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.502178 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.502210 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.502221 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.502236 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.502248 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.605990 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.606077 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.606105 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.606144 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.606170 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.708804 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.708850 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.708864 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.708878 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.708889 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.811876 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.811937 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.811955 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.811980 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.811997 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.915351 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.915417 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.915444 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.915474 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:01 crc kubenswrapper[4839]: I0321 04:25:01.915497 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:01Z","lastTransitionTime":"2026-03-21T04:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.002857 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.002917 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.005487 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c" exitCode=0 Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.005600 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.005640 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"f75e324ef6ce35e2f9d2ecb83aad79d37f2471563f3c265cdfc0e67df74a76f1"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.008545 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerStarted","Data":"abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.008635 4839 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerStarted","Data":"5a140131f05d7b0d663ff416e2780ce265a03da4be693c89cc63345c8b65f3dd"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.012085 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0848faa-daf7-4b62-a20f-36d92678db1d" containerID="5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea" exitCode=0 Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.012127 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerDied","Data":"5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.012154 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerStarted","Data":"811b088a00fd73f9b5eefe3a065a3555e9a32793837e4125cec34d9084a87730"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.016024 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.018155 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.018197 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.018208 4839 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.018224 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.018235 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.030449 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e0
7bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.046881 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.061626 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.075415 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.088684 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.112494 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.127803 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.127836 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.127844 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.127858 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.127868 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.134516 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.146050 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.161030 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.171151 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.179769 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.189450 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.199662 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, 
/tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.211998 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.227098 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.230043 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.230087 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.230100 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.230121 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.230136 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.237250 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.249722 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.259235 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.272001 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.280263 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.293503 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.304314 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.322810 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.331413 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.332532 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.332715 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.332805 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.332902 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.332987 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.346555 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.436025 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.436068 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.436082 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.436131 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.436144 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.538564 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.538647 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.538675 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.538698 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.538712 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.640842 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.641270 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.641279 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.641294 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.641304 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.744505 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.744755 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.744887 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.744997 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.745103 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.848286 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.848352 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.848365 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.848385 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.848398 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.951420 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.951469 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.951482 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.951507 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:02 crc kubenswrapper[4839]: I0321 04:25:02.951521 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:02Z","lastTransitionTime":"2026-03-21T04:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.020142 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.020192 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.020202 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.020211 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.022101 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerStarted","Data":"d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.042246 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.053213 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.054007 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.054060 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.054074 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.054090 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.054137 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.068187 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.084715 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.094116 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.103495 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.112971 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.122101 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.130470 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.143983 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.153944 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.157018 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.157064 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.157077 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.157095 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.157107 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.167093 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.177978 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.259657 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.259711 4839 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.259722 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.259742 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.259754 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.361835 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.361871 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.361883 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.361898 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.361909 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.451768 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:03 crc kubenswrapper[4839]: E0321 04:25:03.451868 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.451787 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.451768 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:03 crc kubenswrapper[4839]: E0321 04:25:03.451929 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:03 crc kubenswrapper[4839]: E0321 04:25:03.452067 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.464096 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.464139 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.464147 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.464163 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.464172 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.566947 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.567004 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.567022 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.567045 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.567063 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.670121 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.670162 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.670171 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.670188 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.670197 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.773017 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.773068 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.773084 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.773110 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.773126 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.876189 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.876242 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.876259 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.876283 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.876300 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.983162 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.983262 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.983285 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.983314 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:03 crc kubenswrapper[4839]: I0321 04:25:03.983334 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:03Z","lastTransitionTime":"2026-03-21T04:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.031007 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.031100 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.033803 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0848faa-daf7-4b62-a20f-36d92678db1d" containerID="d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad" exitCode=0 Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.033856 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerDied","Data":"d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.063761 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.077060 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.086818 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.086861 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.086873 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.086889 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.086900 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.098989 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.108592 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.116443 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.126390 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.136897 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.144740 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.153736 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.162894 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.171903 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.181028 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.189161 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.189221 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.189245 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.189276 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.189297 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.193063 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc 
kubenswrapper[4839]: I0321 04:25:04.292377 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.292410 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.292421 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.292436 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.292446 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.321418 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.321489 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.321512 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.321542 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.321594 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: E0321 04:25:04.335639 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.340731 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.340881 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.340907 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.340935 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.340957 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: E0321 04:25:04.356754 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.361071 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.361112 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.361121 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.361137 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.361150 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: E0321 04:25:04.374777 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.379720 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.379794 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.379811 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.379830 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.379912 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: E0321 04:25:04.393929 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.399435 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.399477 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.399487 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.399502 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.399513 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: E0321 04:25:04.408786 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:04 crc kubenswrapper[4839]: E0321 04:25:04.408900 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.410688 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.410716 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.410725 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.410740 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.410749 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.518110 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.518193 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.518221 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.518268 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.518291 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.621753 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.621804 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.621818 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.621837 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.621851 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.724239 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.724282 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.724296 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.724315 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.724329 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.827425 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.827480 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.827496 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.827516 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.827532 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.930976 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.931025 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.931036 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.931053 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:04 crc kubenswrapper[4839]: I0321 04:25:04.931066 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:04Z","lastTransitionTime":"2026-03-21T04:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.034961 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.035041 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.035063 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.035099 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.035119 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.040028 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0848faa-daf7-4b62-a20f-36d92678db1d" containerID="688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2" exitCode=0 Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.040094 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerDied","Data":"688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.061447 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.084605 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.101878 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.116284 4839 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.136947 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 
04:25:05.136977 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.136984 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.136997 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.137005 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.142229 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.154622 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.172143 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.180705 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.192617 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, 
/tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.200313 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.209333 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.217559 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.223849 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.239192 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.239224 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.239235 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.239250 4839 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.239261 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.341164 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.341494 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.341504 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.341516 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.341526 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.443363 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.443392 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.443401 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.443413 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.443421 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.452233 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.452237 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.452310 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:05 crc kubenswrapper[4839]: E0321 04:25:05.452438 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:05 crc kubenswrapper[4839]: E0321 04:25:05.452467 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:05 crc kubenswrapper[4839]: E0321 04:25:05.452515 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.545780 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.545817 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.545827 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.545842 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.545852 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.648347 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.648383 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.648393 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.648407 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.648418 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.751371 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.751402 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.751414 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.751429 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.751441 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.854244 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.854278 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.854287 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.854301 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.854309 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.956797 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.956856 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.956874 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.956896 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:05 crc kubenswrapper[4839]: I0321 04:25:05.956908 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:05Z","lastTransitionTime":"2026-03-21T04:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.047421 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.051729 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0848faa-daf7-4b62-a20f-36d92678db1d" containerID="137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4" exitCode=0 Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.051789 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerDied","Data":"137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.061801 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.061845 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.061857 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.061875 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.061888 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.068805 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.080826 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.096185 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.108126 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.123556 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.134678 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.146918 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.158477 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.163537 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.163561 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.163586 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.163605 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.163616 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.170718 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.181105 
4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.190258 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.203728 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.220081 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.265339 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.265372 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.265380 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.265393 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 
04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.265402 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.367759 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.367822 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.367846 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.367876 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.367897 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.456016 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:06 crc kubenswrapper[4839]: E0321 04:25:06.456116 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.472767 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.472810 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.472821 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.472835 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.472843 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.488765 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.497270 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.508297 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.517138 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.527978 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.542813 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.556602 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\"
,\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f
745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.563367 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.575158 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.575197 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 
04:25:06.575208 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.575224 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.575236 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.576503 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.587002 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.596441 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.606253 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.612449 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.677686 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.677715 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.677723 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.677736 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.677745 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.782357 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.782398 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.782410 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.782430 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.782447 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.816252 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-sxs57"] Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.816636 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.818913 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.818961 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.819009 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.820317 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.834141 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb8
04e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"moun
tPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.841824 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.852050 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.862618 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.873341 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.884859 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.884894 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.884930 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.884952 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.884964 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.893407 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.901635 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.917152 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.928291 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.936306 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.946794 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.957524 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.965533 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.969818 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqm8m\" (UniqueName: \"kubernetes.io/projected/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-kube-api-access-vqm8m\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.969861 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-host\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.969896 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-serviceca\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.974803 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.987805 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.987838 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.987846 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.987861 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:06 crc kubenswrapper[4839]: I0321 04:25:06.987869 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:06Z","lastTransitionTime":"2026-03-21T04:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.057791 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerStarted","Data":"6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.069227 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.070642 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-serviceca\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.070732 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqm8m\" (UniqueName: \"kubernetes.io/projected/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-kube-api-access-vqm8m\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.070761 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-host\") pod \"node-ca-sxs57\" (UID: 
\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.070828 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-host\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.071865 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-serviceca\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.079159 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d2347692302
62450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.089598 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqm8m\" (UniqueName: \"kubernetes.io/projected/e99177d8-5f41-4cee-a2c9-ae1c314d9d8d-kube-api-access-vqm8m\") pod \"node-ca-sxs57\" (UID: \"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\") " pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.089539 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.090250 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.090289 4839 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.090303 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.090317 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.090330 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.096801 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.105637 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.113413 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.128027 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.130094 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sxs57" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.144236 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa37
23269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"c
ri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: W0321 04:25:07.147293 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode99177d8_5f41_4cee_a2c9_ae1c314d9d8d.slice/crio-f2aca93f8be5ce0f5e7828afb1727002c39312da949d6da70bc8973ef7174727 WatchSource:0}: Error finding container f2aca93f8be5ce0f5e7828afb1727002c39312da949d6da70bc8973ef7174727: Status 404 returned error can't find the container with id 
f2aca93f8be5ce0f5e7828afb1727002c39312da949d6da70bc8973ef7174727 Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.158266 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.169703 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.180926 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.190266 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.194237 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.194274 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.194283 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.194296 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.194306 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.203756 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.211670 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.297812 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.297845 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.297856 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.297877 4839 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.297888 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.401194 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.401225 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.401235 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.401256 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.401265 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.452440 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.452458 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:07 crc kubenswrapper[4839]: E0321 04:25:07.452611 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:07 crc kubenswrapper[4839]: E0321 04:25:07.452694 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.503316 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.503364 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.503375 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.503392 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.503402 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.605613 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.605662 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.605674 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.605691 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.605701 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.707545 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.707596 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.707604 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.707618 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.707631 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.813841 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.813876 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.813885 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.813900 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.813908 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.916013 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.916099 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.916110 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.916127 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:07 crc kubenswrapper[4839]: I0321 04:25:07.916139 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:07Z","lastTransitionTime":"2026-03-21T04:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.018174 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.018225 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.018238 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.018256 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.018268 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.064061 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.064209 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.064342 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.064371 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.066779 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0848faa-daf7-4b62-a20f-36d92678db1d" containerID="6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07" exitCode=0 Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.066827 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerDied","Data":"6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.068550 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sxs57" event={"ID":"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d","Type":"ContainerStarted","Data":"4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.068618 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sxs57" 
event={"ID":"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d","Type":"ContainerStarted","Data":"f2aca93f8be5ce0f5e7828afb1727002c39312da949d6da70bc8973ef7174727"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.072267 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.082174 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.089519 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.090100 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.091307 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.102497 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.115991 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.121111 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.121146 4839 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.121157 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.121175 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.121188 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.131656 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.141450 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.163794 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.170300 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.185412 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, 
/tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.195717 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.206264 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.215210 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.224220 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.224274 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.224285 4839 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.224297 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.224334 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.226383 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e0
7bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.236022 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.245597 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.257752 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.269357 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9
gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.276026 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.290111 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.298120 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.310866 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.320105 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.326023 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.326061 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.326073 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.326089 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.326099 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.328989 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.338259 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.344525 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.352355 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.359128 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.428824 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.428861 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.428889 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.428912 4839 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.428921 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.452671 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:08 crc kubenswrapper[4839]: E0321 04:25:08.453435 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.531432 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.531465 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.531473 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.531503 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.531513 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.633339 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.633378 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.633388 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.633402 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.633412 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.736328 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.736369 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.736378 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.736393 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.736402 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.838169 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.838206 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.838215 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.838231 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.838242 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.941068 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.941107 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.941117 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.941133 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:08 crc kubenswrapper[4839]: I0321 04:25:08.941145 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:08Z","lastTransitionTime":"2026-03-21T04:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.044001 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.044033 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.044042 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.044055 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.044064 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.073933 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.074005 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.078219 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0848faa-daf7-4b62-a20f-36d92678db1d" containerID="9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3" exitCode=0 Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.078291 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerDied","Data":"9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.096330 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.106955 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.124098 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.136305 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9
gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.147364 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.148757 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.148780 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.149486 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.149527 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.149541 4839 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.160242 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.171792 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.187390 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.204721 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\
"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818
bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.214132 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.223547 4839 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.235707 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.243406 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.252356 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.252382 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.252391 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.252405 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.252413 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.254760 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.271600 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.284920 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.301790 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.313727 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.327204 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.339728 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.349593 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.354456 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.354489 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.354500 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.354521 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.354533 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.360312 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.370890 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.385185 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.394010 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.406100 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.418409 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.427182 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.452539 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.452701 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:09 crc kubenswrapper[4839]: E0321 04:25:09.452724 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:09 crc kubenswrapper[4839]: E0321 04:25:09.452908 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.458180 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.458214 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.458227 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.458244 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.458256 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.562763 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.562867 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.562899 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.562936 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.562963 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.665772 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.665808 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.665817 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.665830 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.665839 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.769019 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.769059 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.769070 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.769088 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.769098 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.871794 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.871990 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.872053 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.872119 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.872192 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.975008 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.975051 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.975062 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.975078 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:09 crc kubenswrapper[4839]: I0321 04:25:09.975089 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:09Z","lastTransitionTime":"2026-03-21T04:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.078065 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.078099 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.078111 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.078126 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.078137 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.083910 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" event={"ID":"e0848faa-daf7-4b62-a20f-36d92678db1d","Type":"ContainerStarted","Data":"2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.102895 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:2
3:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.113352 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.132603 4839 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.145930 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.158191 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.171395 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.181873 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.181912 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.181925 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.181942 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.181955 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.182040 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.192624 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.202070 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.213927 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.226894 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.242586 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.257152 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.270507 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:10Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.284730 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.284777 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.284791 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.284811 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.284823 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.387083 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.387131 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.387140 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.387155 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.387163 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.452181 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:10 crc kubenswrapper[4839]: E0321 04:25:10.453111 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.489543 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.489601 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.489610 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.489625 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.489672 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.591938 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.591975 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.591984 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.591999 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.592008 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.694092 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.694133 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.694143 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.694157 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.694166 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.796261 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.796305 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.796314 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.796328 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.796337 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.899651 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.899743 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.899776 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.899806 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:10 crc kubenswrapper[4839]: I0321 04:25:10.899830 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:10Z","lastTransitionTime":"2026-03-21T04:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.002639 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.002717 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.002759 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.002801 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.002819 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.089226 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.093056 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/0.log" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.099761 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172" exitCode=1 Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.099850 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.100754 4839 scope.go:117] "RemoveContainer" containerID="a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.106657 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.106704 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.106718 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.106737 4839 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.106753 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.112256 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.127435 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.140417 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.154065 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.168215 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.186181 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.197487 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.208676 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.208707 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.208716 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.208729 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.208738 4839 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.214940 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.228562 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.241026 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.253652 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.262168 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.276413 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.287111 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.299133 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.309860 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.310678 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.310707 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.310716 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.310730 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.310739 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.322306 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.334216 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.343617 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.370943 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-0
3-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\"
:\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.393486 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.412323 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.412368 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.412380 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.412398 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.412411 4839 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.423116 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o
://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.434599 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, 
/tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.444557 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.452035 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.452081 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:11 crc kubenswrapper[4839]: E0321 04:25:11.452114 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:11 crc kubenswrapper[4839]: E0321 04:25:11.452208 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.455179 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.465003 4839 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.474582 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.484465 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:11Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.518936 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.518984 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.519016 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc 
kubenswrapper[4839]: I0321 04:25:11.519035 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.519045 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.621275 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.621313 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.621325 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.621341 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.621352 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.724001 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.724047 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.724063 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.724087 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.724104 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.825856 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.825934 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.825947 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.825962 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.825974 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.937363 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.937390 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.937399 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.937413 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:11 crc kubenswrapper[4839]: I0321 04:25:11.937421 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:11Z","lastTransitionTime":"2026-03-21T04:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.039908 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.039943 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.039955 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.039971 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.039984 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.105457 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/1.log" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.106627 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/0.log" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.109456 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63" exitCode=1 Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.109511 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.109542 4839 scope.go:117] "RemoveContainer" containerID="a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.110237 4839 scope.go:117] "RemoveContainer" containerID="4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63" Mar 21 04:25:12 crc kubenswrapper[4839]: E0321 04:25:12.110367 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.113425 4839 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.131932 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.141182 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.142013 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.142042 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.142055 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc 
kubenswrapper[4839]: I0321 04:25:12.142071 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.142081 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.153106 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.164843 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.178093 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.189599 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.199339 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.220755 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.235787 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.244639 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.244669 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.244678 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.244691 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.244699 4839 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.258004 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 
04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.268863 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.279431 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.290305 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.298429 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.317904 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.330188 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.346667 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.346702 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.346712 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.346728 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.346740 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.346906 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 
04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.358613 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.368193 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.379143 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.389666 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.401321 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.411365 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.423261 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.433659 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.447403 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.448658 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.448696 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.448709 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.448735 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.448748 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.451778 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:12 crc kubenswrapper[4839]: E0321 04:25:12.451901 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.459122 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\"
:\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.466781 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\
"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.550512 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.550545 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.550556 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.550586 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.550597 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.591194 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57"] Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.591733 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.593731 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.594207 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.605506 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.617817 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.630239 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.640776 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.642311 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n8pc\" (UniqueName: \"kubernetes.io/projected/4dee692e-c3b8-4538-86d7-210dd7e96173-kube-api-access-6n8pc\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.642381 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4dee692e-c3b8-4538-86d7-210dd7e96173-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc 
kubenswrapper[4839]: I0321 04:25:12.642446 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4dee692e-c3b8-4538-86d7-210dd7e96173-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.642469 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4dee692e-c3b8-4538-86d7-210dd7e96173-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.652703 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.652745 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.652754 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.652768 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.652777 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.655514 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v
6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-b
incopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.666745 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.677723 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.693879 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.703448 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.719449 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 
2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 
04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 
04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87
f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.729375 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.743764 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n8pc\" (UniqueName: \"kubernetes.io/projected/4dee692e-c3b8-4538-86d7-210dd7e96173-kube-api-access-6n8pc\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.743840 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4dee692e-c3b8-4538-86d7-210dd7e96173-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.743890 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4dee692e-c3b8-4538-86d7-210dd7e96173-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.743912 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4dee692e-c3b8-4538-86d7-210dd7e96173-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.744650 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4dee692e-c3b8-4538-86d7-210dd7e96173-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.744935 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4dee692e-c3b8-4538-86d7-210dd7e96173-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.746396 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, 
/tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.752936 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4dee692e-c3b8-4538-86d7-210dd7e96173-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.754927 4839 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.754953 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.754961 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.754975 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.754983 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.759539 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n8pc\" (UniqueName: \"kubernetes.io/projected/4dee692e-c3b8-4538-86d7-210dd7e96173-kube-api-access-6n8pc\") pod \"ovnkube-control-plane-749d76644c-9hl57\" (UID: \"4dee692e-c3b8-4538-86d7-210dd7e96173\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.763285 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.775951 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.786494 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:12Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.857508 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.857540 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.857549 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.857562 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.857584 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.904299 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" Mar 21 04:25:12 crc kubenswrapper[4839]: W0321 04:25:12.918455 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dee692e_c3b8_4538_86d7_210dd7e96173.slice/crio-e3778dcf037c8735baaeaca27f2f08bc7a560ecd9e2017dbe012d3156a6e515d WatchSource:0}: Error finding container e3778dcf037c8735baaeaca27f2f08bc7a560ecd9e2017dbe012d3156a6e515d: Status 404 returned error can't find the container with id e3778dcf037c8735baaeaca27f2f08bc7a560ecd9e2017dbe012d3156a6e515d Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.960015 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.960070 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.960083 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.960105 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:12 crc kubenswrapper[4839]: I0321 04:25:12.960123 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:12Z","lastTransitionTime":"2026-03-21T04:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.062213 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.062265 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.062274 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.062290 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.062299 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.121874 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/1.log" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.125346 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" event={"ID":"4dee692e-c3b8-4538-86d7-210dd7e96173","Type":"ContainerStarted","Data":"e3778dcf037c8735baaeaca27f2f08bc7a560ecd9e2017dbe012d3156a6e515d"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.164441 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.164478 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.164491 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.164506 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.164518 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.248602 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.248719 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.248778 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.248950 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.249020 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:45.249004347 +0000 UTC m=+149.576791023 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.249112 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:25:45.24910251 +0000 UTC m=+149.576889186 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.249179 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.249219 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:45.249209323 +0000 UTC m=+149.576995999 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.266483 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.266542 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.266560 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.266604 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.266619 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.328624 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-445ww"] Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.329054 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.329103 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.343649 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.349897 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b26h\" (UniqueName: \"kubernetes.io/projected/fa13ce27-53f2-4178-8560-251f0bb3f034-kube-api-access-4b26h\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.349940 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.349980 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.350015 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350152 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350183 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350180 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350222 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350233 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered] Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350280 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:45.350265113 +0000 UTC m=+149.678051789 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350196 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.350333 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:45.350319195 +0000 UTC m=+149.678105861 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.354690 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.367432 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.369865 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.369908 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.369919 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.369940 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.369951 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.380916 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.394945 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.407134 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.416163 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.433906 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.450035 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.451735 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.451778 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.451851 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.451980 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.452353 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b26h\" (UniqueName: \"kubernetes.io/projected/fa13ce27-53f2-4178-8560-251f0bb3f034-kube-api-access-4b26h\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.452391 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.452524 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.452581 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:13.952556432 +0000 UTC m=+118.280343108 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.466319 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 
04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.467364 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b26h\" (UniqueName: 
\"kubernetes.io/projected/fa13ce27-53f2-4178-8560-251f0bb3f034-kube-api-access-4b26h\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.472431 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.472459 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.472471 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.472487 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.472497 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.476948 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.486414 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc 
kubenswrapper[4839]: I0321 04:25:13.503932 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.515847 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.531903 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.544471 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:13Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.576885 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.577187 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.577296 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.577414 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.577528 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.679901 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.679930 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.679940 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.679955 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.679966 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.781973 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.782048 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.782070 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.782099 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.782120 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.885242 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.885278 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.885294 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.885308 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.885318 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.956775 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.956948 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: E0321 04:25:13.957010 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:14.956993072 +0000 UTC m=+119.284779748 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.987074 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.987116 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.987125 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.987138 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:13 crc kubenswrapper[4839]: I0321 04:25:13.987147 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:13Z","lastTransitionTime":"2026-03-21T04:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.089539 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.089608 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.089619 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.089636 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.089649 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.129785 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" event={"ID":"4dee692e-c3b8-4538-86d7-210dd7e96173","Type":"ContainerStarted","Data":"99d7bb8a699d6022c3036c24052e58f8f45699b88971a2137e3d8bc27a21fe56"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.129824 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" event={"ID":"4dee692e-c3b8-4538-86d7-210dd7e96173","Type":"ContainerStarted","Data":"6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.142745 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.157520 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.176551 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.188624 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.192150 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.192190 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.192201 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.192215 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.192225 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.201760 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.212981 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.221987 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.234819 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.267437 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.294640 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.295132 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.295187 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.295199 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.295217 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.295229 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.308350 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z 
is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.318041 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc 
kubenswrapper[4839]: I0321 04:25:14.334373 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.344034 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.359362 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 
04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.369945 4839 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f45699b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.397550 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.397624 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 
04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.397634 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.397649 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.397658 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.451975 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.452092 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.500012 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.500079 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.500095 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.500119 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.500134 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.529610 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.529653 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.529662 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.529679 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.529690 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.540467 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.543546 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.543600 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.543610 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.543623 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.543632 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.557051 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.557076 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.557086 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.557142 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.557154 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.571361 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.575062 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.575111 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.575122 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.575140 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.575157 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.588870 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.592999 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.593050 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.593065 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.593092 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.593107 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.605908 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:14Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.606041 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.608224 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.608263 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.608273 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.608292 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.608306 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.711173 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.711226 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.711238 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.711258 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.711269 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.813543 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.813622 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.813638 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.813654 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.813668 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.915892 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.915961 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.915991 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.916006 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.916016 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:14Z","lastTransitionTime":"2026-03-21T04:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:14 crc kubenswrapper[4839]: I0321 04:25:14.967058 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.967266 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:14 crc kubenswrapper[4839]: E0321 04:25:14.967330 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:16.967310385 +0000 UTC m=+121.295097071 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.017917 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.017957 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.017965 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.017979 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.017988 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.119960 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.119997 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.120006 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.120020 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.120029 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.222194 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.222227 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.222235 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.222248 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.222257 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.325465 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.325783 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.325794 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.325809 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.325820 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.428105 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.428142 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.428153 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.428168 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.428178 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.452324 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.452338 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:15 crc kubenswrapper[4839]: E0321 04:25:15.452473 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:15 crc kubenswrapper[4839]: E0321 04:25:15.452607 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.452794 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:15 crc kubenswrapper[4839]: E0321 04:25:15.453049 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.530624 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.530766 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.530802 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.530834 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.530855 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.633543 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.633618 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.633630 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.633650 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.633666 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.736203 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.736247 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.736260 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.736281 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.736293 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.838546 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.838633 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.838651 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.838676 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.838692 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.941755 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.941793 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.941830 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.941848 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:15 crc kubenswrapper[4839]: I0321 04:25:15.941860 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:15Z","lastTransitionTime":"2026-03-21T04:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.045032 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.045096 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.045113 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.045138 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.045156 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:16Z","lastTransitionTime":"2026-03-21T04:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.150445 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.150496 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.150509 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.150527 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.150541 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:16Z","lastTransitionTime":"2026-03-21T04:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.253424 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.253469 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.253482 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.253500 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.253514 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:16Z","lastTransitionTime":"2026-03-21T04:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.355442 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.355477 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.355486 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.355498 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.355507 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:16Z","lastTransitionTime":"2026-03-21T04:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.451870 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:16 crc kubenswrapper[4839]: E0321 04:25:16.452761 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:16 crc kubenswrapper[4839]: E0321 04:25:16.455660 4839 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.464614 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.478979 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.490342 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.505146 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.517282 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.527798 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc 
kubenswrapper[4839]: E0321 04:25:16.535927 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.547752 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc
0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.558270 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.577728 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 
04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.587805 4839 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f45699b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.596744 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.607097 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.617683 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.627956 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.640499 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.650908 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:16 crc kubenswrapper[4839]: I0321 04:25:16.985875 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:16 crc kubenswrapper[4839]: E0321 04:25:16.986041 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:16 crc kubenswrapper[4839]: E0321 04:25:16.986137 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:20.986112183 +0000 UTC m=+125.313898899 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.451932 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.451952 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.451972 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:17 crc kubenswrapper[4839]: E0321 04:25:17.452401 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:17 crc kubenswrapper[4839]: E0321 04:25:17.452435 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:17 crc kubenswrapper[4839]: E0321 04:25:17.452256 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.775224 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.791163 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.803702 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc 
kubenswrapper[4839]: I0321 04:25:17.829618 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.847153 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.867480 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5c1d96269996b1709d6a1c4744c11ca23d8e4fd56bc740881a1ddb1f0198172\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"message\\\":\\\"7\\\\nI0321 04:25:10.380763 6652 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:10.380776 6652 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380779 6652 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:10.380790 6652 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:10.380821 6652 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:10.380831 6652 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0321 04:25:10.380838 6652 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:10.380893 6652 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.380983 6652 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0321 04:25:10.381323 6652 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 
04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPat
h\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.881145 4839 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.892550 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.907667 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.921702 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.933253 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.946435 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.959828 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.972846 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:17 crc kubenswrapper[4839]: I0321 04:25:17.988288 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:17Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:18 crc kubenswrapper[4839]: I0321 04:25:18.002595 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:18 crc kubenswrapper[4839]: I0321 04:25:18.017754 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:18Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:18 crc kubenswrapper[4839]: I0321 04:25:18.452544 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:18 crc kubenswrapper[4839]: E0321 04:25:18.452778 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:19 crc kubenswrapper[4839]: I0321 04:25:19.452768 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:19 crc kubenswrapper[4839]: I0321 04:25:19.452773 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:19 crc kubenswrapper[4839]: E0321 04:25:19.452893 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:19 crc kubenswrapper[4839]: I0321 04:25:19.452781 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:19 crc kubenswrapper[4839]: E0321 04:25:19.453048 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:19 crc kubenswrapper[4839]: E0321 04:25:19.453167 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:20 crc kubenswrapper[4839]: I0321 04:25:20.451858 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:20 crc kubenswrapper[4839]: E0321 04:25:20.452236 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:21 crc kubenswrapper[4839]: I0321 04:25:21.028317 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:21 crc kubenswrapper[4839]: E0321 04:25:21.028426 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:21 crc kubenswrapper[4839]: E0321 04:25:21.029042 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:29.029022604 +0000 UTC m=+133.356809280 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:21 crc kubenswrapper[4839]: I0321 04:25:21.452303 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:21 crc kubenswrapper[4839]: I0321 04:25:21.452345 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:21 crc kubenswrapper[4839]: E0321 04:25:21.452487 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:21 crc kubenswrapper[4839]: E0321 04:25:21.452663 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:21 crc kubenswrapper[4839]: I0321 04:25:21.453002 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:21 crc kubenswrapper[4839]: E0321 04:25:21.453372 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:21 crc kubenswrapper[4839]: E0321 04:25:21.537519 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 21 04:25:22 crc kubenswrapper[4839]: I0321 04:25:22.452443 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:22 crc kubenswrapper[4839]: E0321 04:25:22.452741 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:23 crc kubenswrapper[4839]: I0321 04:25:23.452777 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:23 crc kubenswrapper[4839]: E0321 04:25:23.452949 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:23 crc kubenswrapper[4839]: I0321 04:25:23.452800 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:23 crc kubenswrapper[4839]: E0321 04:25:23.453031 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:23 crc kubenswrapper[4839]: I0321 04:25:23.453072 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:23 crc kubenswrapper[4839]: E0321 04:25:23.453118 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.452380 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:24 crc kubenswrapper[4839]: E0321 04:25:24.452643 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.809525 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.809613 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.809631 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.809657 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.809682 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:24Z","lastTransitionTime":"2026-03-21T04:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:24 crc kubenswrapper[4839]: E0321 04:25:24.830342 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:24Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.834467 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.834516 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.834530 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.834553 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.834597 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:24Z","lastTransitionTime":"2026-03-21T04:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:24 crc kubenswrapper[4839]: E0321 04:25:24.848249 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:24Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.852129 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.852168 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.852180 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.852197 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.852209 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:24Z","lastTransitionTime":"2026-03-21T04:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:24 crc kubenswrapper[4839]: E0321 04:25:24.867287 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:24Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.871679 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.871716 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.871728 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.871745 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.871760 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:24Z","lastTransitionTime":"2026-03-21T04:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:24 crc kubenswrapper[4839]: E0321 04:25:24.884379 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:24Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.888460 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.888554 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.888596 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.888620 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:24 crc kubenswrapper[4839]: I0321 04:25:24.888638 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:24Z","lastTransitionTime":"2026-03-21T04:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:24 crc kubenswrapper[4839]: E0321 04:25:24.900621 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:24Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:24 crc kubenswrapper[4839]: E0321 04:25:24.900728 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.452349 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.452349 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.452351 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:25 crc kubenswrapper[4839]: E0321 04:25:25.452761 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:25 crc kubenswrapper[4839]: E0321 04:25:25.452812 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:25 crc kubenswrapper[4839]: E0321 04:25:25.452841 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.452880 4839 scope.go:117] "RemoveContainer" containerID="4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.467328 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e0
1a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.481323 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.494660 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.511410 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.523231 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.544974 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.555784 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.576433 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 
for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.587635 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.597893 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc 
kubenswrapper[4839]: I0321 04:25:25.612756 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd
21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.631623 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.643809 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.651423 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.661965 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:25 crc kubenswrapper[4839]: I0321 04:25:25.671239 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:25Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.178131 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/1.log" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.181526 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" 
event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2"} Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.182367 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.195263 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"moun
tPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.207040 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.223423 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.237070 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.248051 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.267088 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.278497 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.294314 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 
for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 
04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.304779 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.313087 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc 
kubenswrapper[4839]: I0321 04:25:26.323802 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd
21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.334967 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.347052 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.357556 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.369662 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.386727 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.452397 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:26 crc kubenswrapper[4839]: E0321 04:25:26.452604 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.479989 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 
04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.497555 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.512532 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc 
kubenswrapper[4839]: I0321 04:25:26.533542 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: E0321 04:25:26.538256 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.550835 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.565604 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.579831 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.591299 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.606727 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.621398 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.634348 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.649725 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.659998 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.668522 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.679304 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:26 crc kubenswrapper[4839]: I0321 04:25:26.690365 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.186136 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/2.log" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.186942 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/1.log" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.190328 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2" exitCode=1 Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.190391 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" 
event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2"} Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.190437 4839 scope.go:117] "RemoveContainer" containerID="4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.190898 4839 scope.go:117] "RemoveContainer" containerID="c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2" Mar 21 04:25:27 crc kubenswrapper[4839]: E0321 04:25:27.191042 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.218069 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f8906b09cbadb32578e0445be63ed6c1e3ce446f301f92ae844664327d19a63\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"message\\\":\\\"al\\\\nI0321 04:25:12.025879 6856 handler.go:190] Sending *v1.EgressIP event handler 8 
for removal\\\\nI0321 04:25:12.025892 6856 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0321 04:25:12.025912 6856 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0321 04:25:12.025931 6856 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0321 04:25:12.025958 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0321 04:25:12.025968 6856 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0321 04:25:12.025975 6856 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0321 04:25:12.025945 6856 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0321 04:25:12.025953 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0321 04:25:12.025985 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0321 04:25:12.026002 6856 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0321 04:25:12.026060 6856 handler.go:208] Removed *v1.Node event handler 2\\\\nI0321 04:25:12.026070 6856 handler.go:208] Removed *v1.Node event handler 7\\\\nI0321 04:25:12.026092 6856 factory.go:656] Stopping watch factory\\\\nI0321 04:25:12.026121 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0321 04:25:12.026162 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0321 04:25:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 
openshift-multus/network-metrics-daemon-445ww openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer 
becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f
3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.231552 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.244231 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc 
kubenswrapper[4839]: I0321 04:25:27.266199 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.280704 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.296361 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.312916 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.327461 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.341164 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.357194 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.370340 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.388434 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.404741 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.416798 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.434888 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.449941 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:27Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.452131 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.452162 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:27 crc kubenswrapper[4839]: I0321 04:25:27.452230 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:27 crc kubenswrapper[4839]: E0321 04:25:27.452332 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:27 crc kubenswrapper[4839]: E0321 04:25:27.452414 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:27 crc kubenswrapper[4839]: E0321 04:25:27.452504 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.195593 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/2.log" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.198492 4839 scope.go:117] "RemoveContainer" containerID="c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2" Mar 21 04:25:28 crc kubenswrapper[4839]: E0321 04:25:28.198662 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.209640 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.222599 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.236987 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.250050 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.261237 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.270637 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.286591 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.298383 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.312595 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.332630 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.347335 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.367009 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 openshift-multus/network-metrics-daemon-445ww 
openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.379279 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.391300 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc 
kubenswrapper[4839]: I0321 04:25:28.414132 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.431722 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:28Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:28 crc kubenswrapper[4839]: I0321 04:25:28.452176 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:28 crc kubenswrapper[4839]: E0321 04:25:28.452511 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:29 crc kubenswrapper[4839]: I0321 04:25:29.109291 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:29 crc kubenswrapper[4839]: E0321 04:25:29.109514 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:29 crc kubenswrapper[4839]: E0321 04:25:29.109623 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:25:45.109599733 +0000 UTC m=+149.437386439 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:29 crc kubenswrapper[4839]: I0321 04:25:29.452631 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:29 crc kubenswrapper[4839]: I0321 04:25:29.452670 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:29 crc kubenswrapper[4839]: E0321 04:25:29.452833 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:29 crc kubenswrapper[4839]: I0321 04:25:29.452670 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:29 crc kubenswrapper[4839]: E0321 04:25:29.453124 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:29 crc kubenswrapper[4839]: E0321 04:25:29.453186 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:30 crc kubenswrapper[4839]: I0321 04:25:30.465002 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:30 crc kubenswrapper[4839]: E0321 04:25:30.465221 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:31 crc kubenswrapper[4839]: I0321 04:25:31.452668 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:31 crc kubenswrapper[4839]: I0321 04:25:31.452702 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:31 crc kubenswrapper[4839]: E0321 04:25:31.452793 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:31 crc kubenswrapper[4839]: E0321 04:25:31.452920 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:31 crc kubenswrapper[4839]: I0321 04:25:31.452958 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:31 crc kubenswrapper[4839]: E0321 04:25:31.453036 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:31 crc kubenswrapper[4839]: E0321 04:25:31.539627 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:25:32 crc kubenswrapper[4839]: I0321 04:25:32.452653 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:32 crc kubenswrapper[4839]: E0321 04:25:32.453068 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:33 crc kubenswrapper[4839]: I0321 04:25:33.452722 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:33 crc kubenswrapper[4839]: I0321 04:25:33.452724 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:33 crc kubenswrapper[4839]: E0321 04:25:33.452865 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:33 crc kubenswrapper[4839]: E0321 04:25:33.453025 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:33 crc kubenswrapper[4839]: I0321 04:25:33.453919 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:33 crc kubenswrapper[4839]: E0321 04:25:33.454162 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.452116 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:34 crc kubenswrapper[4839]: E0321 04:25:34.452237 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.468996 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.953415 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.953487 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.953506 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.953538 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.953559 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:34Z","lastTransitionTime":"2026-03-21T04:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:34 crc kubenswrapper[4839]: E0321 04:25:34.977180 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:34Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.983754 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.983814 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.983831 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.983856 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:34 crc kubenswrapper[4839]: I0321 04:25:34.983873 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:34Z","lastTransitionTime":"2026-03-21T04:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.006040 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.011546 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.011622 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.011643 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.011671 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.011690 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:35Z","lastTransitionTime":"2026-03-21T04:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.029610 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.034717 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.034771 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.034785 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.034869 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.034895 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:35Z","lastTransitionTime":"2026-03-21T04:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.053125 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.058037 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.058104 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.058122 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.058150 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.058168 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:35Z","lastTransitionTime":"2026-03-21T04:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.077664 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:35Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.077895 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.452104 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.452143 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.452239 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:35 crc kubenswrapper[4839]: I0321 04:25:35.452156 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.452673 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:35 crc kubenswrapper[4839]: E0321 04:25:35.452762 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.452325 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:36 crc kubenswrapper[4839]: E0321 04:25:36.452515 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.471075 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f45699b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.488352 4839 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc 
kubenswrapper[4839]: I0321 04:25:36.523764 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: E0321 04:25:36.540443 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.543668 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.584102 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 openshift-multus/network-metrics-daemon-445ww 
openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.606480 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.626917 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.650775 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.665535 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.678662 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.691264 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.707554 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.725019 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.740715 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.760511 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.776200 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:36 crc kubenswrapper[4839]: I0321 04:25:36.789532 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:36Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:37 crc kubenswrapper[4839]: I0321 04:25:37.452558 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:37 crc kubenswrapper[4839]: I0321 04:25:37.452707 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:37 crc kubenswrapper[4839]: E0321 04:25:37.452748 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:37 crc kubenswrapper[4839]: E0321 04:25:37.452837 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:37 crc kubenswrapper[4839]: I0321 04:25:37.453366 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:37 crc kubenswrapper[4839]: E0321 04:25:37.453525 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:38 crc kubenswrapper[4839]: I0321 04:25:38.452609 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:38 crc kubenswrapper[4839]: E0321 04:25:38.452778 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:39 crc kubenswrapper[4839]: I0321 04:25:39.452555 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:39 crc kubenswrapper[4839]: I0321 04:25:39.452728 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:39 crc kubenswrapper[4839]: E0321 04:25:39.453358 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:39 crc kubenswrapper[4839]: I0321 04:25:39.452733 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:39 crc kubenswrapper[4839]: E0321 04:25:39.453498 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:39 crc kubenswrapper[4839]: E0321 04:25:39.454692 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:39 crc kubenswrapper[4839]: I0321 04:25:39.469758 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 21 04:25:40 crc kubenswrapper[4839]: I0321 04:25:40.451853 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:40 crc kubenswrapper[4839]: E0321 04:25:40.451975 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:41 crc kubenswrapper[4839]: I0321 04:25:41.452794 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:41 crc kubenswrapper[4839]: I0321 04:25:41.452864 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:41 crc kubenswrapper[4839]: E0321 04:25:41.453890 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:41 crc kubenswrapper[4839]: E0321 04:25:41.454021 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:41 crc kubenswrapper[4839]: I0321 04:25:41.452907 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:41 crc kubenswrapper[4839]: E0321 04:25:41.454106 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:41 crc kubenswrapper[4839]: E0321 04:25:41.542163 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:25:42 crc kubenswrapper[4839]: I0321 04:25:42.452464 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:42 crc kubenswrapper[4839]: E0321 04:25:42.453176 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:42 crc kubenswrapper[4839]: I0321 04:25:42.453871 4839 scope.go:117] "RemoveContainer" containerID="c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2" Mar 21 04:25:42 crc kubenswrapper[4839]: E0321 04:25:42.454141 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:25:43 crc kubenswrapper[4839]: I0321 04:25:43.452450 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:43 crc kubenswrapper[4839]: E0321 04:25:43.452651 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:43 crc kubenswrapper[4839]: I0321 04:25:43.452466 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:43 crc kubenswrapper[4839]: E0321 04:25:43.452743 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:43 crc kubenswrapper[4839]: I0321 04:25:43.452466 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:43 crc kubenswrapper[4839]: E0321 04:25:43.452811 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:44 crc kubenswrapper[4839]: I0321 04:25:44.452892 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:44 crc kubenswrapper[4839]: E0321 04:25:44.453083 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.179146 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.179407 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.179532 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:26:17.179499645 +0000 UTC m=+181.507286491 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.280791 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.280995 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:26:49.280947992 +0000 UTC m=+213.608734678 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.281074 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.281160 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.281240 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.281307 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:26:49.281292273 +0000 UTC m=+213.609078949 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.281373 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.281505 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:26:49.281461188 +0000 UTC m=+213.609248064 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.382905 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.383033 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383140 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383181 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383195 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383254 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:26:49.383237285 +0000 UTC m=+213.711023961 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383297 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383326 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383343 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.383452 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:26:49.3834009 +0000 UTC m=+213.711187756 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.452349 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.452486 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.453168 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.452513 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.453271 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.453391 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.470834 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.470903 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.470920 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.470948 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.470966 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:45Z","lastTransitionTime":"2026-03-21T04:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.485703 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.489524 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.489592 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.489602 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.489618 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.489627 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:45Z","lastTransitionTime":"2026-03-21T04:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.502702 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.506653 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.506733 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.506750 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.506780 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.506793 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:45Z","lastTransitionTime":"2026-03-21T04:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.522611 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.522658 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.522672 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.522689 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.522700 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:45Z","lastTransitionTime":"2026-03-21T04:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.536228 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.541556 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.541670 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.541698 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.541735 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:45 crc kubenswrapper[4839]: I0321 04:25:45.541760 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:45Z","lastTransitionTime":"2026-03-21T04:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.555281 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:45Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:45 crc kubenswrapper[4839]: E0321 04:25:45.555430 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.452105 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:46 crc kubenswrapper[4839]: E0321 04:25:46.452247 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.463003 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f45699b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.472682 4839 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc 
kubenswrapper[4839]: I0321 04:25:46.495825 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.508804 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.527551 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 openshift-multus/network-metrics-daemon-445ww 
openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.542607 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: E0321 04:25:46.543327 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.557347 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.569215 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed2
1\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.581270 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\",\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.594360 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.605789 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.622108 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.635334 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.647967 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.658965 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.672202 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.683191 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:46 crc kubenswrapper[4839]: I0321 04:25:46.697289 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:46Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:47 crc kubenswrapper[4839]: I0321 04:25:47.452014 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:47 crc kubenswrapper[4839]: I0321 04:25:47.452080 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:47 crc kubenswrapper[4839]: I0321 04:25:47.452014 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:47 crc kubenswrapper[4839]: E0321 04:25:47.452161 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:47 crc kubenswrapper[4839]: E0321 04:25:47.452210 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:47 crc kubenswrapper[4839]: E0321 04:25:47.452301 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.265040 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/0.log" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.265094 4839 generic.go:334] "Generic (PLEG): container finished" podID="1602189b-f4f3-40ee-ba63-c695c11069d0" containerID="abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747" exitCode=1 Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.265122 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerDied","Data":"abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747"} Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.265473 4839 scope.go:117] "RemoveContainer" containerID="abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.284063 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.296541 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.313613 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc 
openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 openshift-multus/network-metrics-daemon-445ww openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.325631 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.335865 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc 
kubenswrapper[4839]: I0321 04:25:48.350551 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.364676 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.375712 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.386720 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.399882 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.411757 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.423764 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.433909 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.445812 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.452509 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:48 crc kubenswrapper[4839]: E0321 04:25:48.452616 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.458129 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.470355 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.480535 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:48 crc kubenswrapper[4839]: I0321 04:25:48.490548 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:48Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.269678 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/0.log" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.269733 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerStarted","Data":"bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1"} Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.281525 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.294990 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.307778 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.318708 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.328922 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.339816 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.350464 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.360970 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.369508 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.382026 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.392448 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.406753 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.417675 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.426085 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc 
kubenswrapper[4839]: I0321 04:25:49.443426 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.452722 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.452900 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.453064 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:49 crc kubenswrapper[4839]: E0321 04:25:49.453056 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:49 crc kubenswrapper[4839]: E0321 04:25:49.453249 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:49 crc kubenswrapper[4839]: E0321 04:25:49.453394 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.454249 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.471075 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 openshift-multus/network-metrics-daemon-445ww 
openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:49 crc kubenswrapper[4839]: I0321 04:25:49.480229 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:49Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:50 crc kubenswrapper[4839]: I0321 04:25:50.451971 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:50 crc kubenswrapper[4839]: E0321 04:25:50.452243 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:51 crc kubenswrapper[4839]: I0321 04:25:51.452161 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:51 crc kubenswrapper[4839]: I0321 04:25:51.452188 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:51 crc kubenswrapper[4839]: I0321 04:25:51.452287 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:51 crc kubenswrapper[4839]: E0321 04:25:51.452332 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:51 crc kubenswrapper[4839]: E0321 04:25:51.452473 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:51 crc kubenswrapper[4839]: E0321 04:25:51.452538 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:51 crc kubenswrapper[4839]: E0321 04:25:51.545060 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:25:52 crc kubenswrapper[4839]: I0321 04:25:52.452492 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:52 crc kubenswrapper[4839]: E0321 04:25:52.452627 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:53 crc kubenswrapper[4839]: I0321 04:25:53.452979 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:53 crc kubenswrapper[4839]: I0321 04:25:53.453109 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:53 crc kubenswrapper[4839]: E0321 04:25:53.453182 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:53 crc kubenswrapper[4839]: I0321 04:25:53.453233 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:53 crc kubenswrapper[4839]: E0321 04:25:53.453383 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:53 crc kubenswrapper[4839]: E0321 04:25:53.453506 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:53 crc kubenswrapper[4839]: I0321 04:25:53.454669 4839 scope.go:117] "RemoveContainer" containerID="c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.286609 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/2.log" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.289581 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.290050 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.308810 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.319670 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.340045 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc 
openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 openshift-multus/network-metrics-daemon-445ww openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer 
becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.350728 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.360339 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc 
kubenswrapper[4839]: I0321 04:25:54.375705 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.393145 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.404460 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.415794 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.426948 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.438126 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.448123 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.452437 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:54 crc kubenswrapper[4839]: E0321 04:25:54.452527 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.458182 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":
\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.468213 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.478253 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.490536 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.501696 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:54 crc kubenswrapper[4839]: I0321 04:25:54.510670 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.295314 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/3.log" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.296253 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/2.log" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.298855 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" exitCode=1 Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.298922 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.298992 4839 scope.go:117] "RemoveContainer" containerID="c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.299948 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.300225 
4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.325416 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.340459 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.363720 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.380443 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.398559 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.420443 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04
:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.433427 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.452748 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.452892 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.452935 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.452996 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.453090 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.453173 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.459093 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3b3b070bcb4467cbc49a3b7bcae7caeb0458824dadce792fcc31a4a2df828b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:26Z\\\",\\\"message\\\":\\\"e/networking-console-plugin-85b44fc459-gdk6g openshift-etcd/etcd-crc openshift-multus/multus-zqcw4 openshift-network-diagnostics/network-check-target-xd92c openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57 openshift-multus/network-metrics-daemon-445ww 
openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-node-identity/network-node-identity-vrzqb]\\\\nI0321 04:25:26.224924 7118 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0321 04:25:26.224943 7118 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224952 7118 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0321 04:25:26.224964 7118 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nF0321 04:25:26.224967 7118 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer becau\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:54Z\\\",\\\"message\\\":\\\"led to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z]\\\\nI0321 04:25:54.232530 7446 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
_uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} 
typ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.470139 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.484325 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc 
kubenswrapper[4839]: I0321 04:25:55.498790 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.511680 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.522261 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.533509 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.542438 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.551877 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.563831 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.575931 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.898670 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.898721 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.898736 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:55 crc 
kubenswrapper[4839]: I0321 04:25:55.898756 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.898772 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:55Z","lastTransitionTime":"2026-03-21T04:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.912166 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.916254 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.916293 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.916305 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.916322 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.916334 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:55Z","lastTransitionTime":"2026-03-21T04:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.935714 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.940347 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.940382 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.940393 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.940407 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.940415 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:55Z","lastTransitionTime":"2026-03-21T04:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.956550 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.961740 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.961827 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.961865 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.961896 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.961920 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:55Z","lastTransitionTime":"2026-03-21T04:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.974408 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.978140 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.978198 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.978215 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.978242 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:25:55 crc kubenswrapper[4839]: I0321 04:25:55.978259 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:25:55Z","lastTransitionTime":"2026-03-21T04:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.990248 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:55Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:55 crc kubenswrapper[4839]: E0321 04:25:55.990375 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.303335 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/3.log" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.306752 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:25:56 crc kubenswrapper[4839]: E0321 04:25:56.306907 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.319515 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.330162 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.342465 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.353658 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.366523 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.378330 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.390520 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.401196 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.412689 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.426840 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.441176 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.452065 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:56 crc kubenswrapper[4839]: E0321 04:25:56.452329 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.452201 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"n
ame\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.462283 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.466558 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.478788 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.498525 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:54Z\\\",\\\"message\\\":\\\"led to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z]\\\\nI0321 04:25:54.232530 7446 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} typ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.513150 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.523823 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc 
kubenswrapper[4839]: E0321 04:25:56.546060 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.547448 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc
0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.559889 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.569558 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8235ac9-7c3f-438e-957f-1bdedeff6f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c5299f0598312d0ef997d7c51fad5c0b882bd65e5964794ac66179575373fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.582409 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.593860 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.608003 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.621175 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.638918 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.654898 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.665408 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.683518 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04
:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.693764 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.710758 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:54Z\\\",\\\"message\\\":\\\"led to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z]\\\\nI0321 04:25:54.232530 7446 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} typ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.721062 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.730643 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc 
kubenswrapper[4839]: I0321 04:25:56.740627 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.752061 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.762901 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.774140 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:56 crc kubenswrapper[4839]: I0321 04:25:56.783698 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:56Z is after 2025-08-24T17:21:41Z" Mar 21 04:25:57 crc kubenswrapper[4839]: I0321 04:25:57.452737 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:57 crc kubenswrapper[4839]: I0321 04:25:57.452833 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:57 crc kubenswrapper[4839]: E0321 04:25:57.452877 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:25:57 crc kubenswrapper[4839]: I0321 04:25:57.452833 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:57 crc kubenswrapper[4839]: E0321 04:25:57.453031 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:57 crc kubenswrapper[4839]: E0321 04:25:57.453206 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:58 crc kubenswrapper[4839]: I0321 04:25:58.452273 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:25:58 crc kubenswrapper[4839]: E0321 04:25:58.452466 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:25:59 crc kubenswrapper[4839]: I0321 04:25:59.452126 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:25:59 crc kubenswrapper[4839]: I0321 04:25:59.452272 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:25:59 crc kubenswrapper[4839]: E0321 04:25:59.452281 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:25:59 crc kubenswrapper[4839]: E0321 04:25:59.452476 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:25:59 crc kubenswrapper[4839]: I0321 04:25:59.452730 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:25:59 crc kubenswrapper[4839]: E0321 04:25:59.452869 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:00 crc kubenswrapper[4839]: I0321 04:26:00.452405 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:00 crc kubenswrapper[4839]: E0321 04:26:00.453185 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:01 crc kubenswrapper[4839]: I0321 04:26:01.452728 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:01 crc kubenswrapper[4839]: I0321 04:26:01.452738 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:01 crc kubenswrapper[4839]: E0321 04:26:01.452883 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:01 crc kubenswrapper[4839]: E0321 04:26:01.452962 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:01 crc kubenswrapper[4839]: I0321 04:26:01.452816 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:01 crc kubenswrapper[4839]: E0321 04:26:01.453077 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:01 crc kubenswrapper[4839]: E0321 04:26:01.547552 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:26:02 crc kubenswrapper[4839]: I0321 04:26:02.451919 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:02 crc kubenswrapper[4839]: E0321 04:26:02.452035 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:03 crc kubenswrapper[4839]: I0321 04:26:03.452370 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:03 crc kubenswrapper[4839]: I0321 04:26:03.452417 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:03 crc kubenswrapper[4839]: I0321 04:26:03.452432 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:03 crc kubenswrapper[4839]: E0321 04:26:03.452533 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:03 crc kubenswrapper[4839]: E0321 04:26:03.452725 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:03 crc kubenswrapper[4839]: E0321 04:26:03.452771 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:04 crc kubenswrapper[4839]: I0321 04:26:04.452732 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:04 crc kubenswrapper[4839]: E0321 04:26:04.453513 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:05 crc kubenswrapper[4839]: I0321 04:26:05.452390 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:05 crc kubenswrapper[4839]: I0321 04:26:05.452388 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:05 crc kubenswrapper[4839]: E0321 04:26:05.452596 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:05 crc kubenswrapper[4839]: E0321 04:26:05.452611 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:05 crc kubenswrapper[4839]: I0321 04:26:05.452505 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:05 crc kubenswrapper[4839]: E0321 04:26:05.452816 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.368908 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.368953 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.368964 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.368980 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.368990 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:06Z","lastTransitionTime":"2026-03-21T04:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.381795 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.384935 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.384974 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.384983 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.384997 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.385006 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:06Z","lastTransitionTime":"2026-03-21T04:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.399597 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.404048 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.404096 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.404105 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.404119 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.404150 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:06Z","lastTransitionTime":"2026-03-21T04:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.415027 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.418734 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.418775 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.418789 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.418806 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.418817 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:06Z","lastTransitionTime":"2026-03-21T04:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.431805 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.435031 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.435069 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.435079 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.435095 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.435104 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:06Z","lastTransitionTime":"2026-03-21T04:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.445330 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.445489 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.452515 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.452693 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.466413 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 
10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.484922 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.496859 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.510579 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.523839 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.535394 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: E0321 04:26:06.547969 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.549114 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8235ac9-7c3f-438e-957f-1bdedeff6f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c5299f0598312d0ef997d7c51fad5c0b882bd65e5964794ac66179575373fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.562429 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.573466 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.585212 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.600483 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.614052 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7
d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.627334 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.640281 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.659115 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04
:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.671076 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.698297 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:54Z\\\",\\\"message\\\":\\\"led to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z]\\\\nI0321 04:25:54.232530 7446 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} typ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.709384 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:06 crc kubenswrapper[4839]: I0321 04:26:06.718386 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:06Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:07 crc 
kubenswrapper[4839]: I0321 04:26:07.452108 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:07 crc kubenswrapper[4839]: I0321 04:26:07.452149 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:07 crc kubenswrapper[4839]: I0321 04:26:07.452150 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:07 crc kubenswrapper[4839]: E0321 04:26:07.452266 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:07 crc kubenswrapper[4839]: E0321 04:26:07.452380 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:07 crc kubenswrapper[4839]: E0321 04:26:07.452431 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:08 crc kubenswrapper[4839]: I0321 04:26:08.452205 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:08 crc kubenswrapper[4839]: E0321 04:26:08.452331 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:09 crc kubenswrapper[4839]: I0321 04:26:09.452676 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:09 crc kubenswrapper[4839]: I0321 04:26:09.452767 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:09 crc kubenswrapper[4839]: E0321 04:26:09.452833 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:09 crc kubenswrapper[4839]: E0321 04:26:09.452910 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:09 crc kubenswrapper[4839]: I0321 04:26:09.454329 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:09 crc kubenswrapper[4839]: E0321 04:26:09.454531 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:10 crc kubenswrapper[4839]: I0321 04:26:10.451816 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:10 crc kubenswrapper[4839]: E0321 04:26:10.451959 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:10 crc kubenswrapper[4839]: I0321 04:26:10.452698 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:26:10 crc kubenswrapper[4839]: E0321 04:26:10.452944 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:26:11 crc kubenswrapper[4839]: I0321 04:26:11.452612 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:11 crc kubenswrapper[4839]: E0321 04:26:11.452730 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:11 crc kubenswrapper[4839]: I0321 04:26:11.452879 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:11 crc kubenswrapper[4839]: E0321 04:26:11.452923 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:11 crc kubenswrapper[4839]: I0321 04:26:11.453010 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:11 crc kubenswrapper[4839]: E0321 04:26:11.453058 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:11 crc kubenswrapper[4839]: E0321 04:26:11.549191 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:26:12 crc kubenswrapper[4839]: I0321 04:26:12.451959 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:12 crc kubenswrapper[4839]: E0321 04:26:12.452113 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:13 crc kubenswrapper[4839]: I0321 04:26:13.452832 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:13 crc kubenswrapper[4839]: E0321 04:26:13.453187 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:13 crc kubenswrapper[4839]: I0321 04:26:13.453499 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:13 crc kubenswrapper[4839]: E0321 04:26:13.453688 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:13 crc kubenswrapper[4839]: I0321 04:26:13.454028 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:13 crc kubenswrapper[4839]: E0321 04:26:13.454223 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:14 crc kubenswrapper[4839]: I0321 04:26:14.452529 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:14 crc kubenswrapper[4839]: E0321 04:26:14.452785 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:15 crc kubenswrapper[4839]: I0321 04:26:15.452769 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:15 crc kubenswrapper[4839]: I0321 04:26:15.452881 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:15 crc kubenswrapper[4839]: E0321 04:26:15.452918 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:15 crc kubenswrapper[4839]: I0321 04:26:15.452790 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:15 crc kubenswrapper[4839]: E0321 04:26:15.453135 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:15 crc kubenswrapper[4839]: E0321 04:26:15.453189 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.452782 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.452914 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.471963 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 
10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.487378 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.500840 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.515455 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.525051 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.534338 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.542778 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8235ac9-7c3f-438e-957f-1bdedeff6f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c5299f0598312d0ef997d7c51fad5c0b882bd65e5964794ac66179575373fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.549493 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.553957 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.567101 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f
5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.580637 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3ce591ac49d818720803d65a1f2cbc28e1781a5bd3cff7c6cdaaa4b55c01b62\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.585884 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.585921 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.585933 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.585950 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.585961 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:16Z","lastTransitionTime":"2026-03-21T04:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.595207 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.600455 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.605059 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.605105 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.605115 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.605130 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.605143 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:16Z","lastTransitionTime":"2026-03-21T04:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.608465 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-scp2c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0848faa-daf7-4b62-a20f-36d92678db1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ae22ec83bed051327f3e013af8f13d50dfab76794f5fbc972c4091173c8c464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ee6bb585321fd698c356e2ad9b8f48f62f2b8e09847371060e8ab0395a059ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d6527e3c31caba53559cb6dfe4494b576789a0c8e3e5a7d392352441f1b9b1ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://688598020c8b1be055194d4805d234769230262450e5f816f35cc120ff5ff9a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://137b7d338371020059e20feb028cea9e79fa43d32ff7de446719a1326051c4f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6059c9b392e73f503a825716b885d248ddf659fe4fc1cd23f47353dd0fdc0e07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9367e3b70275a4fd5cc98faa785da0149e890de672bce34ddc831a46be5dd6b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6trk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-scp2c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.617443 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.620590 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.620638 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.620649 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.620665 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.620678 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:16Z","lastTransitionTime":"2026-03-21T04:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.621430 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zqcw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1602189b-f4f3-40ee-ba63-c695c11069d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:48Z\\\",\\\"message\\\":\\\"2026-03-21T04:25:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413\\\\n2026-03-21T04:25:02+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2a3febf-889d-41cb-9c6b-9024dff2f413 to /host/opt/cni/bin/\\\\n2026-03-21T04:25:02Z [verbose] multus-daemon started\\\\n2026-03-21T04:25:02Z [verbose] Readiness Indicator file check\\\\n2026-03-21T04:25:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f9gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zqcw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.631234 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-sxs57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e99177d8-5f41-4cee-a2c9-ae1c314d9d8d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4499202f3d135af4fbc0aaeddb85c3af3d571f0a6a0da31b0c054112cbeef85e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vqm8m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-sxs57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.632908 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.635808 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.635843 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.635851 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.635864 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.635876 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:16Z","lastTransitionTime":"2026-03-21T04:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.649186 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.649561 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845a5f72-fcd9-4e53-a65e-d875a0758312\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ac8aa1efe8200de679b27c91d85e8934dd8b958a487da8adfcb2e7fe4d4af83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5fcb9a692a8e68cdaf00e9ab50192286582dc94a1ec9c1f269deec71d12e709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a90276bf42ba474250d4e51c93b958e93ffcad8ab5f4285766b381f0247c484\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://418cc0271450c53990546d7ffb4506824d2a5889e49064cd40624ba291f485bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2aeb8f0a5e6e7302570f0ecb899e31a8f5e9ed506f9bf7cfa6a0a6d55ae50dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://843b9092466615ee64f818bbd1d6e51298c59932d1e135796967de09fb2f81bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://320bccb832e179c9ab109d22c5cd652cfb3bd770d2c698d91886401d9566efc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e30344cfa0c0c9d42845a517f744a156634132bd6c2f6e30f745751913e968c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.652927 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.652973 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.652986 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.653004 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.653018 4839 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:16Z","lastTransitionTime":"2026-03-21T04:26:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.663347 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea7373c259189351899cd74615296d7188b2529f808af9c8c85098177ffbe85d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.666650 4839 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-21T04:26:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d75f4a6d-5ce3-49bd-a7c7-4f5c3e6bf62f\\\",\\\"systemUUID\\\":\\\"2a7bfad9-30ba-42d8-b982-971191ebb9d6\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: E0321 04:26:16.666769 4839 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.682785 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d634043b-c9ec-4469-b267-26053b1f02f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:03Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-21T04:25:54Z\\\",\\\"message\\\":\\\"led to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:25:54Z is after 2025-08-24T17:21:41Z]\\\\nI0321 04:25:54.232530 7446 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} typ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://466ed122245008a86b
4471250670e086e61c981de3a7a026461c7c2238c87f3c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:25:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdph2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-spl4b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.695454 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dee692e-c3b8-4538-86d7-210dd7e96173\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6058f68ef58f6259be3064c7cea97ec261dc1174655dbe75d4caf208a9c300e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99d7bb8a699d6022c3036c24052e58f8f4569
9b88971a2137e3d8bc27a21fe56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6n8pc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9hl57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:16 crc kubenswrapper[4839]: I0321 04:26:16.705902 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-445ww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa13ce27-53f2-4178-8560-251f0bb3f034\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4b26h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-445ww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:16Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:17 crc 
kubenswrapper[4839]: I0321 04:26:17.219438 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:17 crc kubenswrapper[4839]: E0321 04:26:17.219619 4839 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:26:17 crc kubenswrapper[4839]: E0321 04:26:17.219680 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs podName:fa13ce27-53f2-4178-8560-251f0bb3f034 nodeName:}" failed. No retries permitted until 2026-03-21 04:27:21.219664728 +0000 UTC m=+245.547451394 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs") pod "network-metrics-daemon-445ww" (UID: "fa13ce27-53f2-4178-8560-251f0bb3f034") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 21 04:26:17 crc kubenswrapper[4839]: I0321 04:26:17.452115 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:17 crc kubenswrapper[4839]: I0321 04:26:17.452130 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:17 crc kubenswrapper[4839]: E0321 04:26:17.452326 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:17 crc kubenswrapper[4839]: E0321 04:26:17.452427 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:17 crc kubenswrapper[4839]: I0321 04:26:17.452512 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:17 crc kubenswrapper[4839]: E0321 04:26:17.452729 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:18 crc kubenswrapper[4839]: I0321 04:26:18.452164 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:18 crc kubenswrapper[4839]: E0321 04:26:18.452349 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:19 crc kubenswrapper[4839]: I0321 04:26:19.451763 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:19 crc kubenswrapper[4839]: E0321 04:26:19.452199 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:19 crc kubenswrapper[4839]: I0321 04:26:19.451836 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:19 crc kubenswrapper[4839]: E0321 04:26:19.452899 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:19 crc kubenswrapper[4839]: I0321 04:26:19.451799 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:19 crc kubenswrapper[4839]: E0321 04:26:19.453146 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:20 crc kubenswrapper[4839]: I0321 04:26:20.452553 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:20 crc kubenswrapper[4839]: E0321 04:26:20.452800 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:21 crc kubenswrapper[4839]: I0321 04:26:21.451988 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:21 crc kubenswrapper[4839]: I0321 04:26:21.452074 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:21 crc kubenswrapper[4839]: E0321 04:26:21.452453 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:21 crc kubenswrapper[4839]: I0321 04:26:21.452095 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:21 crc kubenswrapper[4839]: E0321 04:26:21.452818 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:21 crc kubenswrapper[4839]: E0321 04:26:21.452996 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:21 crc kubenswrapper[4839]: E0321 04:26:21.550832 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:26:22 crc kubenswrapper[4839]: I0321 04:26:22.452705 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:22 crc kubenswrapper[4839]: E0321 04:26:22.453139 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:22 crc kubenswrapper[4839]: I0321 04:26:22.453278 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:26:22 crc kubenswrapper[4839]: E0321 04:26:22.453397 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:26:23 crc kubenswrapper[4839]: I0321 04:26:23.452450 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:23 crc kubenswrapper[4839]: I0321 04:26:23.452485 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:23 crc kubenswrapper[4839]: I0321 04:26:23.452450 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:23 crc kubenswrapper[4839]: E0321 04:26:23.452604 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:23 crc kubenswrapper[4839]: E0321 04:26:23.452676 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:23 crc kubenswrapper[4839]: E0321 04:26:23.452726 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:24 crc kubenswrapper[4839]: I0321 04:26:24.452023 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:24 crc kubenswrapper[4839]: E0321 04:26:24.452160 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:25 crc kubenswrapper[4839]: I0321 04:26:25.452001 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:25 crc kubenswrapper[4839]: I0321 04:26:25.452016 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:25 crc kubenswrapper[4839]: I0321 04:26:25.453253 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:25 crc kubenswrapper[4839]: E0321 04:26:25.453487 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:25 crc kubenswrapper[4839]: E0321 04:26:25.453617 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:25 crc kubenswrapper[4839]: E0321 04:26:25.453726 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.453022 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:26 crc kubenswrapper[4839]: E0321 04:26:26.454412 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.474996 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17d5f53fe3137552caf74bf775b44fba819d8790d2bc4d5042744dbd7b7307ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e480aacd9f65a8a4ccedd9c0a943c53bf6de8a8de00eb0d421ec782af8a785bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.493476 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g47qh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e646dbcd-c976-48e4-8dee-497be8a275bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ceedbcee674106a8f519fededf37b2bbc4b4bd660845373899de825f1ed534e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ljz2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g47qh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.512847 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99628ae3-9af9-4946-8e9f-2fc369d82cfa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7ef331cbf4076bc661dd93ec1669ee8b721edbac57aeef3df25f1e36527065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://ed71f104d54976b384a414c08b933d4f7185883ea0e2dc638863d67f9b245505\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:17Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0321 04:23:48.352735 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0321 04:23:48.353466 1 observer_polling.go:159] Starting file observer\\\\nI0321 04:23:48.354210 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0321 04:23:48.354819 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0321 04:24:09.734400 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0321 04:24:17.259238 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0321 04:24:17.259294 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:48Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac4ccdf976fbadd5ce5975dbec37b6be3b17b401a863ed5c1e93736176e1185e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f432640080695974eb1c6c1297fad14596d99d17a162d541417fb3ec762575f5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.529449 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-21T04:24:14Z\\\"
,\\\"message\\\":\\\"W0321 04:24:13.708919 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0321 04:24:13.709951 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774067053 cert, and key in /tmp/serving-cert-1138089316/serving-signer.crt, /tmp/serving-cert-1138089316/serving-signer.key\\\\nI0321 04:24:14.009807 1 observer_polling.go:159] Starting file observer\\\\nW0321 04:24:14.016906 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0321 04:24:14.017068 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0321 04:24:14.017655 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138089316/tls.crt::/tmp/serving-cert-1138089316/tls.key\\\\\\\"\\\\nF0321 04:24:14.917069 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-21T04:24:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:24:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.545909 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: E0321 04:26:26.551503 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.566749 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:25:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fc18a359e07bf64662248d7ceb65c8184407c0c4e14ce5483760975218407c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:25:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9jqbw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:25:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jx4q7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.581247 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b27e726e-fb6e-471f-b189-71637a34994c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://506189caf9101bd8eff0b327400189bde29280613baf494135bf28fa36bd80cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ea5d4d7d744a79c2e5515eaf49d09346d96b37b17eee6e3e32952e389248367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1981f2299df7154d30ce01492943eaadeeed0f27c0b0a39a1228054345aa7a3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fec5f0e1a98ef435cb8b280400cfc7daff0f232d320947f22a93df0b81b6f441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.597828 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8235ac9-7c3f-438e-957f-1bdedeff6f64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-21T04:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c5299f0598312d0ef997d7c51fad5c0b882bd65e5964794ac66179575373fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-21T04:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b603ad92d6de73d444d98c4e9f38fd63b22eabeee61d7a3b00e8ca741016e68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:23:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-21T04:23:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-21T04:26:26Z is after 2025-08-24T17:21:41Z" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.686036 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zqcw4" podStartSLOduration=126.686018502 podStartE2EDuration="2m6.686018502s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.674672125 +0000 UTC m=+191.002458801" watchObservedRunningTime="2026-03-21 04:26:26.686018502 +0000 UTC m=+191.013805178" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.705141 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sxs57" 
podStartSLOduration=126.705122371 podStartE2EDuration="2m6.705122371s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.687441185 +0000 UTC m=+191.015227861" watchObservedRunningTime="2026-03-21 04:26:26.705122371 +0000 UTC m=+191.032909047" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.734001 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-scp2c" podStartSLOduration=126.733983879 podStartE2EDuration="2m6.733983879s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.733559296 +0000 UTC m=+191.061345972" watchObservedRunningTime="2026-03-21 04:26:26.733983879 +0000 UTC m=+191.061770555" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.753774 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9hl57" podStartSLOduration=125.753754097 podStartE2EDuration="2m5.753754097s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.745602135 +0000 UTC m=+191.073388831" watchObservedRunningTime="2026-03-21 04:26:26.753754097 +0000 UTC m=+191.081540773" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.774532 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=93.774518405 podStartE2EDuration="1m33.774518405s" podCreationTimestamp="2026-03-21 04:24:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 
04:26:26.773670589 +0000 UTC m=+191.101457265" watchObservedRunningTime="2026-03-21 04:26:26.774518405 +0000 UTC m=+191.102305081" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.863510 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.863549 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.863610 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.863627 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.863638 4839 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-21T04:26:26Z","lastTransitionTime":"2026-03-21T04:26:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.908963 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d"] Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.910228 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.916057 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.916304 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.916381 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.916935 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.929309 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=52.929281818 podStartE2EDuration="52.929281818s" podCreationTimestamp="2026-03-21 04:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.928112843 +0000 UTC m=+191.255899519" watchObservedRunningTime="2026-03-21 04:26:26.929281818 +0000 UTC m=+191.257068534" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.943016 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=30.942996766 podStartE2EDuration="30.942996766s" podCreationTimestamp="2026-03-21 04:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.942361097 +0000 UTC m=+191.270147803" watchObservedRunningTime="2026-03-21 04:26:26.942996766 
+0000 UTC m=+191.270783442" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.955174 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podStartSLOduration=126.955152608 podStartE2EDuration="2m6.955152608s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.954992563 +0000 UTC m=+191.282779239" watchObservedRunningTime="2026-03-21 04:26:26.955152608 +0000 UTC m=+191.282939284" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.995515 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=47.995499108 podStartE2EDuration="47.995499108s" podCreationTimestamp="2026-03-21 04:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.982527292 +0000 UTC m=+191.310313968" watchObservedRunningTime="2026-03-21 04:26:26.995499108 +0000 UTC m=+191.323285784" Mar 21 04:26:26 crc kubenswrapper[4839]: I0321 04:26:26.996244 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=99.99623707 podStartE2EDuration="1m39.99623707s" podCreationTimestamp="2026-03-21 04:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:26.995422306 +0000 UTC m=+191.323208992" watchObservedRunningTime="2026-03-21 04:26:26.99623707 +0000 UTC m=+191.324023746" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.016815 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/0207a845-18d9-4431-844b-4bd01600c2d5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.016951 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0207a845-18d9-4431-844b-4bd01600c2d5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.017002 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0207a845-18d9-4431-844b-4bd01600c2d5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.017043 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0207a845-18d9-4431-844b-4bd01600c2d5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.017158 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0207a845-18d9-4431-844b-4bd01600c2d5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.118242 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0207a845-18d9-4431-844b-4bd01600c2d5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.118521 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0207a845-18d9-4431-844b-4bd01600c2d5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.118737 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0207a845-18d9-4431-844b-4bd01600c2d5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.118912 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0207a845-18d9-4431-844b-4bd01600c2d5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.118598 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/0207a845-18d9-4431-844b-4bd01600c2d5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.119011 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0207a845-18d9-4431-844b-4bd01600c2d5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.119044 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0207a845-18d9-4431-844b-4bd01600c2d5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.119164 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0207a845-18d9-4431-844b-4bd01600c2d5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.126154 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0207a845-18d9-4431-844b-4bd01600c2d5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc 
kubenswrapper[4839]: I0321 04:26:27.140008 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0207a845-18d9-4431-844b-4bd01600c2d5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vbv7d\" (UID: \"0207a845-18d9-4431-844b-4bd01600c2d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.229057 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.404333 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" event={"ID":"0207a845-18d9-4431-844b-4bd01600c2d5","Type":"ContainerStarted","Data":"7f5715c87d2d5e7a3ef02da86248f54671904f0c67edbbfc034461ba914a9b40"} Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.452043 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.452114 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.452131 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:27 crc kubenswrapper[4839]: E0321 04:26:27.452205 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:27 crc kubenswrapper[4839]: E0321 04:26:27.452260 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:27 crc kubenswrapper[4839]: E0321 04:26:27.452319 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.505257 4839 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 21 04:26:27 crc kubenswrapper[4839]: I0321 04:26:27.513643 4839 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 21 04:26:28 crc kubenswrapper[4839]: I0321 04:26:28.409362 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" event={"ID":"0207a845-18d9-4431-844b-4bd01600c2d5","Type":"ContainerStarted","Data":"3d8f42dbc76c69b64b7fd8850007b7710aac379b8538e5eb25782151e9647dac"} Mar 21 04:26:28 crc kubenswrapper[4839]: I0321 04:26:28.429139 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-g47qh" podStartSLOduration=128.429120201 podStartE2EDuration="2m8.429120201s" 
podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:27.030303313 +0000 UTC m=+191.358089989" watchObservedRunningTime="2026-03-21 04:26:28.429120201 +0000 UTC m=+192.756906897" Mar 21 04:26:28 crc kubenswrapper[4839]: I0321 04:26:28.452307 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:28 crc kubenswrapper[4839]: E0321 04:26:28.452550 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:29 crc kubenswrapper[4839]: I0321 04:26:29.452166 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:29 crc kubenswrapper[4839]: I0321 04:26:29.452236 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:29 crc kubenswrapper[4839]: I0321 04:26:29.452169 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:29 crc kubenswrapper[4839]: E0321 04:26:29.452282 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:29 crc kubenswrapper[4839]: E0321 04:26:29.452317 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:29 crc kubenswrapper[4839]: E0321 04:26:29.452360 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:30 crc kubenswrapper[4839]: I0321 04:26:30.452055 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:30 crc kubenswrapper[4839]: E0321 04:26:30.452405 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:31 crc kubenswrapper[4839]: I0321 04:26:31.451803 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:31 crc kubenswrapper[4839]: I0321 04:26:31.451892 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:31 crc kubenswrapper[4839]: I0321 04:26:31.451970 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:31 crc kubenswrapper[4839]: E0321 04:26:31.451974 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:31 crc kubenswrapper[4839]: E0321 04:26:31.452017 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:31 crc kubenswrapper[4839]: E0321 04:26:31.452097 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:31 crc kubenswrapper[4839]: E0321 04:26:31.553099 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:26:32 crc kubenswrapper[4839]: I0321 04:26:32.452711 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:32 crc kubenswrapper[4839]: E0321 04:26:32.452883 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:33 crc kubenswrapper[4839]: I0321 04:26:33.451835 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:33 crc kubenswrapper[4839]: E0321 04:26:33.451970 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:33 crc kubenswrapper[4839]: I0321 04:26:33.452017 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:33 crc kubenswrapper[4839]: I0321 04:26:33.452181 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:33 crc kubenswrapper[4839]: E0321 04:26:33.452247 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:33 crc kubenswrapper[4839]: E0321 04:26:33.452601 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:33 crc kubenswrapper[4839]: I0321 04:26:33.453277 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:26:33 crc kubenswrapper[4839]: E0321 04:26:33.453422 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-spl4b_openshift-ovn-kubernetes(d634043b-c9ec-4469-b267-26053b1f02f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.425342 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/1.log" Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.426067 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/0.log" Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.426133 4839 generic.go:334] "Generic (PLEG): container finished" podID="1602189b-f4f3-40ee-ba63-c695c11069d0" containerID="bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1" exitCode=1 Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.426168 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerDied","Data":"bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1"} Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.426206 4839 scope.go:117] "RemoveContainer" containerID="abb804e65728bf323531d9b35e52f6700584bab85de4a5b749067d2d89224747" Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.426590 4839 scope.go:117] 
"RemoveContainer" containerID="bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1" Mar 21 04:26:34 crc kubenswrapper[4839]: E0321 04:26:34.426772 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-zqcw4_openshift-multus(1602189b-f4f3-40ee-ba63-c695c11069d0)\"" pod="openshift-multus/multus-zqcw4" podUID="1602189b-f4f3-40ee-ba63-c695c11069d0" Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.441514 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vbv7d" podStartSLOduration=134.441498104 podStartE2EDuration="2m14.441498104s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:28.428552884 +0000 UTC m=+192.756339600" watchObservedRunningTime="2026-03-21 04:26:34.441498104 +0000 UTC m=+198.769284780" Mar 21 04:26:34 crc kubenswrapper[4839]: I0321 04:26:34.452095 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:34 crc kubenswrapper[4839]: E0321 04:26:34.452247 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:35 crc kubenswrapper[4839]: I0321 04:26:35.430071 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/1.log" Mar 21 04:26:35 crc kubenswrapper[4839]: I0321 04:26:35.452483 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:35 crc kubenswrapper[4839]: I0321 04:26:35.452483 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:35 crc kubenswrapper[4839]: E0321 04:26:35.452642 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:35 crc kubenswrapper[4839]: E0321 04:26:35.452698 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:35 crc kubenswrapper[4839]: I0321 04:26:35.452488 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:35 crc kubenswrapper[4839]: E0321 04:26:35.452794 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:36 crc kubenswrapper[4839]: I0321 04:26:36.451999 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:36 crc kubenswrapper[4839]: E0321 04:26:36.452763 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:36 crc kubenswrapper[4839]: E0321 04:26:36.553875 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:26:37 crc kubenswrapper[4839]: I0321 04:26:37.452040 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:37 crc kubenswrapper[4839]: I0321 04:26:37.452139 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:37 crc kubenswrapper[4839]: I0321 04:26:37.452220 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:37 crc kubenswrapper[4839]: E0321 04:26:37.452155 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:37 crc kubenswrapper[4839]: E0321 04:26:37.452323 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:37 crc kubenswrapper[4839]: E0321 04:26:37.452453 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:38 crc kubenswrapper[4839]: I0321 04:26:38.452502 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:38 crc kubenswrapper[4839]: E0321 04:26:38.452762 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:39 crc kubenswrapper[4839]: I0321 04:26:39.452724 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:39 crc kubenswrapper[4839]: I0321 04:26:39.452835 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:39 crc kubenswrapper[4839]: E0321 04:26:39.452874 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:39 crc kubenswrapper[4839]: I0321 04:26:39.452725 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:39 crc kubenswrapper[4839]: E0321 04:26:39.452999 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:39 crc kubenswrapper[4839]: E0321 04:26:39.453169 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:40 crc kubenswrapper[4839]: I0321 04:26:40.451922 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:40 crc kubenswrapper[4839]: E0321 04:26:40.452053 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:41 crc kubenswrapper[4839]: I0321 04:26:41.452029 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:41 crc kubenswrapper[4839]: I0321 04:26:41.452051 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:41 crc kubenswrapper[4839]: I0321 04:26:41.452140 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:41 crc kubenswrapper[4839]: E0321 04:26:41.452141 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:41 crc kubenswrapper[4839]: E0321 04:26:41.452266 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:41 crc kubenswrapper[4839]: E0321 04:26:41.452378 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:41 crc kubenswrapper[4839]: E0321 04:26:41.555249 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:26:42 crc kubenswrapper[4839]: I0321 04:26:42.451943 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:42 crc kubenswrapper[4839]: E0321 04:26:42.452079 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:43 crc kubenswrapper[4839]: I0321 04:26:43.451755 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:43 crc kubenswrapper[4839]: I0321 04:26:43.451785 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:43 crc kubenswrapper[4839]: E0321 04:26:43.451894 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:43 crc kubenswrapper[4839]: I0321 04:26:43.451905 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:43 crc kubenswrapper[4839]: E0321 04:26:43.452005 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:43 crc kubenswrapper[4839]: E0321 04:26:43.452077 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:44 crc kubenswrapper[4839]: I0321 04:26:44.452743 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:44 crc kubenswrapper[4839]: E0321 04:26:44.453143 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:44 crc kubenswrapper[4839]: I0321 04:26:44.453492 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.307355 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-445ww"] Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.307502 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:45 crc kubenswrapper[4839]: E0321 04:26:45.307630 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.451897 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.452271 4839 scope.go:117] "RemoveContainer" containerID="bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1" Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.451969 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:45 crc kubenswrapper[4839]: E0321 04:26:45.452276 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:45 crc kubenswrapper[4839]: E0321 04:26:45.452496 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.460559 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/3.log" Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.465686 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerStarted","Data":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.466057 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:26:45 crc kubenswrapper[4839]: I0321 04:26:45.510845 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podStartSLOduration=145.510823351 podStartE2EDuration="2m25.510823351s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:26:45.509840532 +0000 UTC m=+209.837627208" watchObservedRunningTime="2026-03-21 04:26:45.510823351 +0000 UTC m=+209.838610037" Mar 21 04:26:46 crc kubenswrapper[4839]: I0321 04:26:46.452295 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:46 crc kubenswrapper[4839]: E0321 04:26:46.453819 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:46 crc kubenswrapper[4839]: I0321 04:26:46.469507 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/1.log" Mar 21 04:26:46 crc kubenswrapper[4839]: I0321 04:26:46.469627 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerStarted","Data":"44c7b00e724e15bccb8ef54953306d49bc029cd21069ea40d7f724706be68de4"} Mar 21 04:26:46 crc kubenswrapper[4839]: E0321 04:26:46.555835 4839 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 21 04:26:47 crc kubenswrapper[4839]: I0321 04:26:47.452371 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:47 crc kubenswrapper[4839]: I0321 04:26:47.452377 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:47 crc kubenswrapper[4839]: E0321 04:26:47.452596 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:47 crc kubenswrapper[4839]: I0321 04:26:47.452384 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:47 crc kubenswrapper[4839]: E0321 04:26:47.452708 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:47 crc kubenswrapper[4839]: E0321 04:26:47.453116 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:48 crc kubenswrapper[4839]: I0321 04:26:48.452262 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:48 crc kubenswrapper[4839]: E0321 04:26:48.452478 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.359533 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.359733 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:28:51.359698362 +0000 UTC m=+335.687485058 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.360116 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.360180 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.360293 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.360385 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:28:51.360364872 +0000 UTC m=+335.688151548 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.360304 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.360466 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:28:51.360455935 +0000 UTC m=+335.688242631 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.452369 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.452516 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.452604 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.452685 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.452800 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.452956 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.460973 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:49 crc kubenswrapper[4839]: I0321 04:26:49.461060 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.461212 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.461225 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.461268 4839 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.461280 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.461337 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:28:51.461320581 +0000 UTC m=+335.789107257 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.461234 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 04:26:49.461443 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:26:49 crc kubenswrapper[4839]: E0321 
04:26:49.461501 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:28:51.461482976 +0000 UTC m=+335.789269752 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 21 04:26:50 crc kubenswrapper[4839]: I0321 04:26:50.452479 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:50 crc kubenswrapper[4839]: E0321 04:26:50.452664 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:26:51 crc kubenswrapper[4839]: I0321 04:26:51.452007 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:51 crc kubenswrapper[4839]: I0321 04:26:51.452030 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:51 crc kubenswrapper[4839]: I0321 04:26:51.452052 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:51 crc kubenswrapper[4839]: E0321 04:26:51.452141 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-445ww" podUID="fa13ce27-53f2-4178-8560-251f0bb3f034" Mar 21 04:26:51 crc kubenswrapper[4839]: E0321 04:26:51.452290 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:26:51 crc kubenswrapper[4839]: E0321 04:26:51.452356 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:26:52 crc kubenswrapper[4839]: I0321 04:26:52.451863 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:26:52 crc kubenswrapper[4839]: I0321 04:26:52.454194 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 21 04:26:52 crc kubenswrapper[4839]: I0321 04:26:52.454688 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 21 04:26:53 crc kubenswrapper[4839]: I0321 04:26:53.452122 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:26:53 crc kubenswrapper[4839]: I0321 04:26:53.452205 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:26:53 crc kubenswrapper[4839]: I0321 04:26:53.452122 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww" Mar 21 04:26:53 crc kubenswrapper[4839]: I0321 04:26:53.454399 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 21 04:26:53 crc kubenswrapper[4839]: I0321 04:26:53.454522 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 21 04:26:53 crc kubenswrapper[4839]: I0321 04:26:53.458910 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 21 04:26:53 crc kubenswrapper[4839]: I0321 04:26:53.461360 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.545241 4839 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.579692 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-45jfn"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.580197 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.581146 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.581586 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.582698 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nmj8p"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.583193 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.584201 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.584277 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.584558 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.585892 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k5nwf"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.586453 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.589548 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.589548 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.589654 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.592392 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.592993 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.594855 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.596074 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.596362 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.596526 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.596692 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 04:26:57 
crc kubenswrapper[4839]: I0321 04:26:57.596954 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.597076 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.597500 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.597703 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.597732 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.597635 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.597917 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598164 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598252 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598354 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598443 4839 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598359 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598614 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598902 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598624 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.598676 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.600280 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.601198 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.602350 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.603177 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.603268 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.603524 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.603774 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.603915 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.603780 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.604106 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.604221 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.604351 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gl7rc"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.604404 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" 
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.604486 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.605292 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.606304 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zt77f"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.607099 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.607102 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.614640 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.607173 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.608388 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.616496 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.617204 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.618022 4839 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.618267 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.618980 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.619326 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.619426 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.619532 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.619621 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.619686 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.620183 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.620319 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.620393 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 21 04:26:57 crc 
kubenswrapper[4839]: I0321 04:26:57.620473 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.620552 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.620649 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.621137 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.621806 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.621903 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.622029 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.622144 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.622235 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.634477 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hkg98"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.643877 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.643918 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qp8mz"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.652910 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.655015 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2s6j7"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.655400 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.655616 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qp8mz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.656343 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.656472 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.656623 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.656702 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.657769 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-bj929"] Mar 21 04:26:57 crc kubenswrapper[4839]: 
I0321 04:26:57.657802 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.658211 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.658815 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g2rrh"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.659407 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.659864 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.659741 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.660901 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.661233 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-config\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.667955 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-images\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.667985 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp2sf\" (UniqueName: \"kubernetes.io/projected/3498feaf-72d5-471a-b25e-fb4b68875767-kube-api-access-wp2sf\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668011 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-client-ca\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668039 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668067 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668103 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668149 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3498feaf-72d5-471a-b25e-fb4b68875767-serving-cert\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668173 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29czp\" (UniqueName: \"kubernetes.io/projected/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-kube-api-access-29czp\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668192 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-encryption-config\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668213 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-audit-policies\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668230 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-etcd-client\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668247 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2bf4\" (UniqueName: \"kubernetes.io/projected/67adff78-dfe5-440a-80b0-fefd703c3aa7-kube-api-access-f2bf4\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668294 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/67adff78-dfe5-440a-80b0-fefd703c3aa7-audit-dir\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668317 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-config\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668340 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-serving-cert\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.668360 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.659997 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.660030 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.660127 4839 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.660224 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.660854 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.661248 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.661294 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.661356 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.661378 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.662294 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.663817 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.665096 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.665141 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 
04:26:57.665558 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.665659 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.665759 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.665789 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.665816 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.660170 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nmj8p"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.670187 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-45jfn"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.671621 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.672735 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.673365 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.673631 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.673794 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.673890 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.674606 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.674723 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.674818 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.674915 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.675009 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.675111 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.675206 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.675619 4839 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.676149 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.676356 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.683412 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.686441 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.687629 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.687809 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.687869 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.688073 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.693265 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.697065 4839 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.697602 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.697809 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.697862 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.698084 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.698310 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.698368 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.701611 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zt77f"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.701658 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.714984 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.716002 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.718250 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ql2ps"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.718319 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.720324 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.734304 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.735171 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.735847 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.736351 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gl7rc"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.736461 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.736834 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.738492 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.739148 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.742790 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.743693 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.744507 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.744836 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.745339 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.750114 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-w6dzs"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.750897 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.754184 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qp8mz"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.754228 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k5nwf"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.755516 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g2rrh"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.757743 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2s6j7"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.759885 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.759933 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.763057 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.764089 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8jgh7"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.764914 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.767667 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.768448 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769235 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769283 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-oauth-config\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769312 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769341 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769363 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssrbd\" (UniqueName: \"kubernetes.io/projected/cae2f42f-b7c7-43c7-b397-a8273ea5844b-kube-api-access-ssrbd\") pod \"dns-operator-744455d44c-g2rrh\" (UID: \"cae2f42f-b7c7-43c7-b397-a8273ea5844b\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769385 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pwc7\" (UniqueName: \"kubernetes.io/projected/9d291bc8-87c0-4a9e-b269-52a7801f050b-kube-api-access-9pwc7\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769424 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769462 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc 
kubenswrapper[4839]: I0321 04:26:57.769489 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-encryption-config\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769511 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93bc1508-a828-4d23-b078-1d4164d1bc2c-serving-cert\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769536 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-config\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769557 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-etcd-serving-ca\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769606 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-auth-proxy-config\") pod \"machine-approver-56656f9798-fh8k5\" (UID: 
\"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769641 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r66pp\" (UniqueName: \"kubernetes.io/projected/4d63cdfd-21e7-4a63-960b-363fb131ac08-kube-api-access-r66pp\") pod \"downloads-7954f5f757-qp8mz\" (UID: \"4d63cdfd-21e7-4a63-960b-363fb131ac08\") " pod="openshift-console/downloads-7954f5f757-qp8mz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769667 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-config\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769702 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-config\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769728 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3498feaf-72d5-471a-b25e-fb4b68875767-serving-cert\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769752 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-etcd-client\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769777 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29czp\" (UniqueName: \"kubernetes.io/projected/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-kube-api-access-29czp\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769801 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a79fc033-c671-42ff-aa06-78ae64967c92-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769825 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769850 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-encryption-config\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769876 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769908 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2bf4\" (UniqueName: \"kubernetes.io/projected/67adff78-dfe5-440a-80b0-fefd703c3aa7-kube-api-access-f2bf4\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769918 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769934 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-serving-cert\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.769977 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-audit-policies\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770379 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-audit-policies\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770419 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-etcd-client\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770457 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770479 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770518 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770539 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770556 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770606 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-client\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770623 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-dir\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770640 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770660 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q47zt\" (UniqueName: \"kubernetes.io/projected/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-kube-api-access-q47zt\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770677 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770715 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxvqw\" (UniqueName: \"kubernetes.io/projected/93bc1508-a828-4d23-b078-1d4164d1bc2c-kube-api-access-dxvqw\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770734 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a79fc033-c671-42ff-aa06-78ae64967c92-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770751 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25af4e9d-c029-4ee7-9952-18a3a5e3c333-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tz8sp\" (UID: \"25af4e9d-c029-4ee7-9952-18a3a5e3c333\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770769 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7156267-6917-4c54-ba75-4a91a0772025-config\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770784 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-service-ca-bundle\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770802 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770820 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-serving-cert\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770860 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-config\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770876 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67adff78-dfe5-440a-80b0-fefd703c3aa7-audit-dir\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770894 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770916 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-service-ca\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770932 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8kjw\" (UniqueName: \"kubernetes.io/projected/a79fc033-c671-42ff-aa06-78ae64967c92-kube-api-access-s8kjw\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770949 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-config\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770967 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-serving-cert\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.770983 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771003 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771082 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-config\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771100 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zmdv\" (UniqueName: \"kubernetes.io/projected/f7156267-6917-4c54-ba75-4a91a0772025-kube-api-access-5zmdv\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771118 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94mt6\" (UniqueName: \"kubernetes.io/projected/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-kube-api-access-94mt6\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" 
Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771147 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-audit\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771162 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771176 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-machine-approver-tls\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771194 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgq6h\" (UniqueName: \"kubernetes.io/projected/25af4e9d-c029-4ee7-9952-18a3a5e3c333-kube-api-access-pgq6h\") pod \"cluster-samples-operator-665b6dd947-tz8sp\" (UID: \"25af4e9d-c029-4ee7-9952-18a3a5e3c333\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771208 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-serving-cert\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771223 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmj6b\" (UniqueName: \"kubernetes.io/projected/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-kube-api-access-zmj6b\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771238 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-policies\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771236 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-g9tz2"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771254 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-client-ca\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771271 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-image-import-ca\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771288 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9d291bc8-87c0-4a9e-b269-52a7801f050b-node-pullsecrets\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771303 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-ca\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771319 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-serving-cert\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771339 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-config\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771354 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cae2f42f-b7c7-43c7-b397-a8273ea5844b-metrics-tls\") pod \"dns-operator-744455d44c-g2rrh\" (UID: \"cae2f42f-b7c7-43c7-b397-a8273ea5844b\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771368 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e81e2384-94b0-4639-bb2d-e4152385c932-serving-cert\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771384 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-trusted-ca-bundle\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771475 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njsf7\" (UniqueName: \"kubernetes.io/projected/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-kube-api-access-njsf7\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771498 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771521 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7156267-6917-4c54-ba75-4a91a0772025-serving-cert\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771559 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9d291bc8-87c0-4a9e-b269-52a7801f050b-audit-dir\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771598 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-service-ca\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771618 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlf56\" (UniqueName: \"kubernetes.io/projected/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-kube-api-access-xlf56\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771639 4839 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-images\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771656 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp2sf\" (UniqueName: \"kubernetes.io/projected/3498feaf-72d5-471a-b25e-fb4b68875767-kube-api-access-wp2sf\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771676 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-client-ca\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771692 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7156267-6917-4c54-ba75-4a91a0772025-trusted-ca\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771706 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-config\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 
04:26:57.771729 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-oauth-serving-cert\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771745 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771761 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771777 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5ktq\" (UniqueName: \"kubernetes.io/projected/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-kube-api-access-z5ktq\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771793 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wthj5\" (UniqueName: 
\"kubernetes.io/projected/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-kube-api-access-wthj5\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771808 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64b9q\" (UniqueName: \"kubernetes.io/projected/e81e2384-94b0-4639-bb2d-e4152385c932-kube-api-access-64b9q\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.771822 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.772039 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.772506 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.772519 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.772772 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.773034 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.773060 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.773556 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67adff78-dfe5-440a-80b0-fefd703c3aa7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.773883 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-client-ca\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.774335 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67adff78-dfe5-440a-80b0-fefd703c3aa7-audit-dir\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.774455 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-images\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.775103 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-config\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.776676 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5jhkc"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.778207 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5jhkc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.778337 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-encryption-config\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.779059 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.779458 4839 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-serving-cert\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.780064 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.780860 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.781095 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67adff78-dfe5-440a-80b0-fefd703c3aa7-etcd-client\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.781204 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hkg98"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.782319 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-shqhf"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.782375 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-config\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.783092 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.784084 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.792606 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-bj929"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.793230 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.794089 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3498feaf-72d5-471a-b25e-fb4b68875767-serving-cert\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.794408 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.796114 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.797374 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.798377 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.799478 4839 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-service-ca/service-ca-9c57cc56f-6rrrs"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.800833 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.801716 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.804297 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.805398 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.806759 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-4sj57"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.808501 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.808770 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.809131 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.810926 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ql2ps"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.812454 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567786-d8w8k"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.813145 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.814429 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8jgh7"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.815845 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.817696 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.822124 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-brnnr"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.823808 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-brnnr" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.825688 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cstqb"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.830845 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.832289 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.836368 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.837502 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.838809 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5jhkc"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.839826 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-g9tz2"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.842138 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.842278 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.842321 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn"] Mar 21 04:26:57 crc 
kubenswrapper[4839]: I0321 04:26:57.843294 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6rrrs"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.845208 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-brnnr"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.846214 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567786-d8w8k"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.847999 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-shqhf"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.849166 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.850539 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cstqb"] Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.862099 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.872325 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7156267-6917-4c54-ba75-4a91a0772025-trusted-ca\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.872498 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-config\") pod \"apiserver-76f77b778f-gl7rc\" (UID: 
\"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.872635 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.872949 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873068 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5ktq\" (UniqueName: \"kubernetes.io/projected/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-kube-api-access-z5ktq\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873225 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-oauth-serving-cert\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873319 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wthj5\" 
(UniqueName: \"kubernetes.io/projected/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-kube-api-access-wthj5\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873408 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64b9q\" (UniqueName: \"kubernetes.io/projected/e81e2384-94b0-4639-bb2d-e4152385c932-kube-api-access-64b9q\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873501 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873678 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-oauth-config\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873336 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-config\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873793 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873864 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pwc7\" (UniqueName: \"kubernetes.io/projected/9d291bc8-87c0-4a9e-b269-52a7801f050b-kube-api-access-9pwc7\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873898 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873923 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssrbd\" (UniqueName: \"kubernetes.io/projected/cae2f42f-b7c7-43c7-b397-a8273ea5844b-kube-api-access-ssrbd\") pod \"dns-operator-744455d44c-g2rrh\" (UID: \"cae2f42f-b7c7-43c7-b397-a8273ea5844b\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873966 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-encryption-config\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " 
pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873702 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7156267-6917-4c54-ba75-4a91a0772025-trusted-ca\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.873992 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93bc1508-a828-4d23-b078-1d4164d1bc2c-serving-cert\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874051 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-config\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874082 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2bc52acb-29f0-4f24-a46a-928a529264dc-proxy-tls\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874120 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r66pp\" (UniqueName: \"kubernetes.io/projected/4d63cdfd-21e7-4a63-960b-363fb131ac08-kube-api-access-r66pp\") pod \"downloads-7954f5f757-qp8mz\" (UID: 
\"4d63cdfd-21e7-4a63-960b-363fb131ac08\") " pod="openshift-console/downloads-7954f5f757-qp8mz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874142 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-etcd-serving-ca\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874169 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-auth-proxy-config\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874195 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-config\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874239 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-etcd-client\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874268 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-config\") pod 
\"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874292 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a79fc033-c671-42ff-aa06-78ae64967c92-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874317 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874360 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874387 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bc52acb-29f0-4f24-a46a-928a529264dc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 
crc kubenswrapper[4839]: I0321 04:26:57.874414 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl5rt\" (UniqueName: \"kubernetes.io/projected/2bc52acb-29f0-4f24-a46a-928a529264dc-kube-api-access-bl5rt\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874452 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-serving-cert\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874480 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874507 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f49362-2145-40aa-8a7c-e07c70ea910c-config\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874533 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874555 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874598 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874598 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874622 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-dir\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc 
kubenswrapper[4839]: I0321 04:26:57.874646 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874671 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q47zt\" (UniqueName: \"kubernetes.io/projected/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-kube-api-access-q47zt\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874698 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874721 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-client\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874744 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zt77f\" 
(UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874766 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2bc52acb-29f0-4f24-a46a-928a529264dc-images\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874790 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxvqw\" (UniqueName: \"kubernetes.io/projected/93bc1508-a828-4d23-b078-1d4164d1bc2c-kube-api-access-dxvqw\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874817 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7156267-6917-4c54-ba75-4a91a0772025-config\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874851 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-service-ca-bundle\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874874 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874904 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-serving-cert\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874914 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-config\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874932 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a79fc033-c671-42ff-aa06-78ae64967c92-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874952 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874958 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25af4e9d-c029-4ee7-9952-18a3a5e3c333-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tz8sp\" (UID: \"25af4e9d-c029-4ee7-9952-18a3a5e3c333\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875016 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875049 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-service-ca\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875078 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/28599c04-0840-41a0-91dd-c0ed5bcf99fd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bxg8h\" (UID: \"28599c04-0840-41a0-91dd-c0ed5bcf99fd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875105 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14f49362-2145-40aa-8a7c-e07c70ea910c-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875132 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8kjw\" (UniqueName: \"kubernetes.io/projected/a79fc033-c671-42ff-aa06-78ae64967c92-kube-api-access-s8kjw\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875154 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-config\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875177 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-config\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875200 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875222 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94mt6\" (UniqueName: \"kubernetes.io/projected/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-kube-api-access-94mt6\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875245 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7z9v\" (UniqueName: \"kubernetes.io/projected/28599c04-0840-41a0-91dd-c0ed5bcf99fd-kube-api-access-g7z9v\") pod \"package-server-manager-789f6589d5-bxg8h\" (UID: \"28599c04-0840-41a0-91dd-c0ed5bcf99fd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875287 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zmdv\" (UniqueName: \"kubernetes.io/projected/f7156267-6917-4c54-ba75-4a91a0772025-kube-api-access-5zmdv\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875311 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-machine-approver-tls\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875317 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-etcd-serving-ca\") pod 
\"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875336 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgq6h\" (UniqueName: \"kubernetes.io/projected/25af4e9d-c029-4ee7-9952-18a3a5e3c333-kube-api-access-pgq6h\") pod \"cluster-samples-operator-665b6dd947-tz8sp\" (UID: \"25af4e9d-c029-4ee7-9952-18a3a5e3c333\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875360 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-audit\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875382 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875404 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-policies\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875425 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-client-ca\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875441 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-image-import-ca\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875457 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-serving-cert\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875474 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmj6b\" (UniqueName: \"kubernetes.io/projected/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-kube-api-access-zmj6b\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875490 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-serving-cert\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875508 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9d291bc8-87c0-4a9e-b269-52a7801f050b-node-pullsecrets\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875523 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-ca\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875545 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e81e2384-94b0-4639-bb2d-e4152385c932-serving-cert\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875595 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cae2f42f-b7c7-43c7-b397-a8273ea5844b-metrics-tls\") pod \"dns-operator-744455d44c-g2rrh\" (UID: \"cae2f42f-b7c7-43c7-b397-a8273ea5844b\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875618 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njsf7\" (UniqueName: \"kubernetes.io/projected/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-kube-api-access-njsf7\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875641 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875665 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-trusted-ca-bundle\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875698 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7156267-6917-4c54-ba75-4a91a0772025-serving-cert\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875726 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlf56\" (UniqueName: \"kubernetes.io/projected/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-kube-api-access-xlf56\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875749 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/14f49362-2145-40aa-8a7c-e07c70ea910c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875779 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9d291bc8-87c0-4a9e-b269-52a7801f050b-audit-dir\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875799 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-service-ca\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.875915 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-config\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.876772 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-auth-proxy-config\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.874905 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.877154 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-dir\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.877442 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-encryption-config\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.877456 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-oauth-serving-cert\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.877631 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.877948 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-etcd-client\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.878034 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-service-ca\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.878444 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7156267-6917-4c54-ba75-4a91a0772025-config\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.878665 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.879369 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9d291bc8-87c0-4a9e-b269-52a7801f050b-audit-dir\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") 
" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.879440 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9d291bc8-87c0-4a9e-b269-52a7801f050b-node-pullsecrets\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.879668 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-service-ca\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.879795 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.879812 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.879875 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-config\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: 
\"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.880096 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.880207 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25af4e9d-c029-4ee7-9952-18a3a5e3c333-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tz8sp\" (UID: \"25af4e9d-c029-4ee7-9952-18a3a5e3c333\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.880302 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-client-ca\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.880497 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-service-ca-bundle\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.880873 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/93bc1508-a828-4d23-b078-1d4164d1bc2c-serving-cert\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.881144 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-config\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.881271 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-audit\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.881404 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-config\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.881434 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-policies\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.881687 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.881859 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.881955 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/9d291bc8-87c0-4a9e-b269-52a7801f050b-image-import-ca\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.882060 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-client\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.882651 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-trusted-ca-bundle\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.883631 4839 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.883695 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-oauth-config\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.884157 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/93bc1508-a828-4d23-b078-1d4164d1bc2c-etcd-ca\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.885201 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7156267-6917-4c54-ba75-4a91a0772025-serving-cert\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.885360 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-machine-approver-tls\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.885628 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-serving-cert\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: 
\"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.886339 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.886530 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.886663 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.887641 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.889620 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.889898 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.889958 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a79fc033-c671-42ff-aa06-78ae64967c92-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.891264 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cae2f42f-b7c7-43c7-b397-a8273ea5844b-metrics-tls\") pod \"dns-operator-744455d44c-g2rrh\" (UID: \"cae2f42f-b7c7-43c7-b397-a8273ea5844b\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.892166 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e81e2384-94b0-4639-bb2d-e4152385c932-serving-cert\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.892168 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d291bc8-87c0-4a9e-b269-52a7801f050b-serving-cert\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.894223 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-serving-cert\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.896056 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-serving-cert\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.902740 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.907414 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a79fc033-c671-42ff-aa06-78ae64967c92-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 
04:26:57.922591 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.949717 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.960490 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.962738 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.976636 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/28599c04-0840-41a0-91dd-c0ed5bcf99fd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bxg8h\" (UID: \"28599c04-0840-41a0-91dd-c0ed5bcf99fd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.976694 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14f49362-2145-40aa-8a7c-e07c70ea910c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.976742 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g7z9v\" (UniqueName: \"kubernetes.io/projected/28599c04-0840-41a0-91dd-c0ed5bcf99fd-kube-api-access-g7z9v\") pod \"package-server-manager-789f6589d5-bxg8h\" (UID: \"28599c04-0840-41a0-91dd-c0ed5bcf99fd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.976819 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14f49362-2145-40aa-8a7c-e07c70ea910c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.976886 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2bc52acb-29f0-4f24-a46a-928a529264dc-proxy-tls\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.976980 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bc52acb-29f0-4f24-a46a-928a529264dc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.977006 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl5rt\" (UniqueName: \"kubernetes.io/projected/2bc52acb-29f0-4f24-a46a-928a529264dc-kube-api-access-bl5rt\") pod \"machine-config-operator-74547568cd-lqm8j\" 
(UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.977027 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f49362-2145-40aa-8a7c-e07c70ea910c-config\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.977060 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2bc52acb-29f0-4f24-a46a-928a529264dc-images\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.977773 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bc52acb-29f0-4f24-a46a-928a529264dc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.982692 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 21 04:26:57 crc kubenswrapper[4839]: I0321 04:26:57.990011 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.002312 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.021912 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.043235 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.062375 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.083338 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.102644 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.123536 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.131224 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14f49362-2145-40aa-8a7c-e07c70ea910c-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:58 crc 
kubenswrapper[4839]: I0321 04:26:58.142132 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.148784 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14f49362-2145-40aa-8a7c-e07c70ea910c-config\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.162210 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.182899 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.201907 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.229213 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.242300 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.262868 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.283164 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.303076 4839 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.323311 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.342809 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.362915 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.382927 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.403385 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.422615 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.443758 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.463243 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.468163 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2bc52acb-29f0-4f24-a46a-928a529264dc-images\") pod 
\"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.482812 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.503755 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.511604 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2bc52acb-29f0-4f24-a46a-928a529264dc-proxy-tls\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.523781 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.544521 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.563849 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.583410 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.603346 4839 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.623025 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.644238 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.663803 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.684072 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.702978 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.723658 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.743067 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.760482 4839 request.go:700] Waited for 1.009015012s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-stats-default&limit=500&resourceVersion=0 Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.762160 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 
04:26:58.783351 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.803389 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.823306 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.842365 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.863586 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.882961 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.910386 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.923326 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.943359 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.953721 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/28599c04-0840-41a0-91dd-c0ed5bcf99fd-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bxg8h\" (UID: \"28599c04-0840-41a0-91dd-c0ed5bcf99fd\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.963151 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 21 04:26:58 crc kubenswrapper[4839]: I0321 04:26:58.983115 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.004374 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.038990 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29czp\" (UniqueName: \"kubernetes.io/projected/c4d393d7-42d7-4b7d-a3cd-f7e325b97c54-kube-api-access-29czp\") pod \"machine-api-operator-5694c8668f-nmj8p\" (UID: \"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.063280 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.068169 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2bf4\" (UniqueName: \"kubernetes.io/projected/67adff78-dfe5-440a-80b0-fefd703c3aa7-kube-api-access-f2bf4\") pod \"apiserver-7bbb656c7d-85pc8\" (UID: \"67adff78-dfe5-440a-80b0-fefd703c3aa7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.082600 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.103290 4839 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.117750 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.123508 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.143345 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.168110 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.179519 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp2sf\" (UniqueName: \"kubernetes.io/projected/3498feaf-72d5-471a-b25e-fb4b68875767-kube-api-access-wp2sf\") pod \"controller-manager-879f6c89f-45jfn\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.182497 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.202833 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.223235 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.242776 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 
21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.265239 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.283141 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.303846 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.323248 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.342449 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.383407 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.402055 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.402266 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.408630 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nmj8p"] Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.409906 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8"] Mar 21 04:26:59 crc kubenswrapper[4839]: W0321 04:26:59.416767 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67adff78_dfe5_440a_80b0_fefd703c3aa7.slice/crio-81b4fc74d4bcb1143af960c81d13c8b2beb6b1e43a405460cfe879f905bc17cf WatchSource:0}: Error finding container 81b4fc74d4bcb1143af960c81d13c8b2beb6b1e43a405460cfe879f905bc17cf: Status 404 returned error can't find the container with id 81b4fc74d4bcb1143af960c81d13c8b2beb6b1e43a405460cfe879f905bc17cf Mar 21 04:26:59 crc kubenswrapper[4839]: W0321 04:26:59.417110 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4d393d7_42d7_4b7d_a3cd_f7e325b97c54.slice/crio-c187917beb001e03623ae690390f9e84833b05f6d8e76c1a87f8c27bfd7ec465 WatchSource:0}: Error finding container c187917beb001e03623ae690390f9e84833b05f6d8e76c1a87f8c27bfd7ec465: Status 404 returned error can't find the container with id c187917beb001e03623ae690390f9e84833b05f6d8e76c1a87f8c27bfd7ec465 Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.423008 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.443127 4839 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.461983 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.485376 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.503657 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.521155 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" event={"ID":"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54","Type":"ContainerStarted","Data":"c187917beb001e03623ae690390f9e84833b05f6d8e76c1a87f8c27bfd7ec465"} Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.522381 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" event={"ID":"67adff78-dfe5-440a-80b0-fefd703c3aa7","Type":"ContainerStarted","Data":"81b4fc74d4bcb1143af960c81d13c8b2beb6b1e43a405460cfe879f905bc17cf"} Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.523975 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.543366 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.564209 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.564385 4839 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-45jfn"] Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.582295 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.602776 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.623034 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.643121 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.663009 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.682622 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.723365 4839 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.742482 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.761134 4839 request.go:700] Waited for 1.926755326s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.763190 4839 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.801711 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5ktq\" (UniqueName: \"kubernetes.io/projected/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-kube-api-access-z5ktq\") pod \"oauth-openshift-558db77b4-zt77f\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.816732 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wthj5\" (UniqueName: \"kubernetes.io/projected/f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5-kube-api-access-wthj5\") pod \"openshift-apiserver-operator-796bbdcf4f-8sv4j\" (UID: \"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.841533 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64b9q\" (UniqueName: \"kubernetes.io/projected/e81e2384-94b0-4639-bb2d-e4152385c932-kube-api-access-64b9q\") pod \"route-controller-manager-6576b87f9c-76ctz\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.841858 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.855707 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pwc7\" (UniqueName: \"kubernetes.io/projected/9d291bc8-87c0-4a9e-b269-52a7801f050b-kube-api-access-9pwc7\") pod \"apiserver-76f77b778f-gl7rc\" (UID: \"9d291bc8-87c0-4a9e-b269-52a7801f050b\") " pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.860656 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.867097 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.881505 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssrbd\" (UniqueName: \"kubernetes.io/projected/cae2f42f-b7c7-43c7-b397-a8273ea5844b-kube-api-access-ssrbd\") pod \"dns-operator-744455d44c-g2rrh\" (UID: \"cae2f42f-b7c7-43c7-b397-a8273ea5844b\") " pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.896969 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r66pp\" (UniqueName: \"kubernetes.io/projected/4d63cdfd-21e7-4a63-960b-363fb131ac08-kube-api-access-r66pp\") pod \"downloads-7954f5f757-qp8mz\" (UID: \"4d63cdfd-21e7-4a63-960b-363fb131ac08\") " pod="openshift-console/downloads-7954f5f757-qp8mz" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.921602 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.930604 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qp8mz" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.952805 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q47zt\" (UniqueName: \"kubernetes.io/projected/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-kube-api-access-q47zt\") pod \"console-f9d7485db-bj929\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.955960 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.963189 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxvqw\" (UniqueName: \"kubernetes.io/projected/93bc1508-a828-4d23-b078-1d4164d1bc2c-kube-api-access-dxvqw\") pod \"etcd-operator-b45778765-2s6j7\" (UID: \"93bc1508-a828-4d23-b078-1d4164d1bc2c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:26:59 crc kubenswrapper[4839]: I0321 04:26:59.992296 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94mt6\" (UniqueName: \"kubernetes.io/projected/c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf-kube-api-access-94mt6\") pod \"authentication-operator-69f744f599-k5nwf\" (UID: \"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.009038 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8kjw\" (UniqueName: 
\"kubernetes.io/projected/a79fc033-c671-42ff-aa06-78ae64967c92-kube-api-access-s8kjw\") pod \"openshift-controller-manager-operator-756b6f6bc6-4cwc5\" (UID: \"a79fc033-c671-42ff-aa06-78ae64967c92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.031801 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlf56\" (UniqueName: \"kubernetes.io/projected/14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1-kube-api-access-xlf56\") pod \"machine-approver-56656f9798-fh8k5\" (UID: \"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.036332 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njsf7\" (UniqueName: \"kubernetes.io/projected/3db892a0-fb40-4e0e-93ee-a8f2876ad8be-kube-api-access-njsf7\") pod \"cluster-image-registry-operator-dc59b4c8b-fhqz6\" (UID: \"3db892a0-fb40-4e0e-93ee-a8f2876ad8be\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.065274 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zmdv\" (UniqueName: \"kubernetes.io/projected/f7156267-6917-4c54-ba75-4a91a0772025-kube-api-access-5zmdv\") pod \"console-operator-58897d9998-hkg98\" (UID: \"f7156267-6917-4c54-ba75-4a91a0772025\") " pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.083680 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.085679 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgq6h\" (UniqueName: 
\"kubernetes.io/projected/25af4e9d-c029-4ee7-9952-18a3a5e3c333-kube-api-access-pgq6h\") pod \"cluster-samples-operator-665b6dd947-tz8sp\" (UID: \"25af4e9d-c029-4ee7-9952-18a3a5e3c333\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.087071 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zt77f"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.097599 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.107125 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmj6b\" (UniqueName: \"kubernetes.io/projected/b60d6f1b-b109-4fa4-a85d-ebb845b342bd-kube-api-access-zmj6b\") pod \"openshift-config-operator-7777fb866f-p4nnp\" (UID: \"b60d6f1b-b109-4fa4-a85d-ebb845b342bd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.116980 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.123640 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.126465 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gl7rc"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.128523 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7z9v\" (UniqueName: \"kubernetes.io/projected/28599c04-0840-41a0-91dd-c0ed5bcf99fd-kube-api-access-g7z9v\") pod \"package-server-manager-789f6589d5-bxg8h\" (UID: \"28599c04-0840-41a0-91dd-c0ed5bcf99fd\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.133103 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.137198 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14f49362-2145-40aa-8a7c-e07c70ea910c-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-c4tgz\" (UID: \"14f49362-2145-40aa-8a7c-e07c70ea910c\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.152955 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.162886 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl5rt\" (UniqueName: \"kubernetes.io/projected/2bc52acb-29f0-4f24-a46a-928a529264dc-kube-api-access-bl5rt\") pod \"machine-config-operator-74547568cd-lqm8j\" (UID: \"2bc52acb-29f0-4f24-a46a-928a529264dc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.174897 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qp8mz"] Mar 21 04:27:00 crc kubenswrapper[4839]: W0321 04:27:00.186898 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d63cdfd_21e7_4a63_960b_363fb131ac08.slice/crio-3f550c2c325d41fb1c4343671414703435e40bc13120f59039726685c708adaf WatchSource:0}: Error finding container 3f550c2c325d41fb1c4343671414703435e40bc13120f59039726685c708adaf: Status 404 returned error can't find the container with id 3f550c2c325d41fb1c4343671414703435e40bc13120f59039726685c708adaf Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.203854 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.212405 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215602 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10dc7791-eebd-49e9-8d9c-63711119e9d7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215673 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-bound-sa-token\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215702 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10dc7791-eebd-49e9-8d9c-63711119e9d7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215746 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-trusted-ca\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215764 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81a76ad1-da33-4b42-9c0a-d0ada077729a-config-volume\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215799 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ce563b-8e5b-4abe-b71b-02c588bff511-service-ca-bundle\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215815 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbz79\" (UniqueName: \"kubernetes.io/projected/7f2c6e22-6a88-4c63-9da2-e38b813e0f1c-kube-api-access-dbz79\") pod \"migrator-59844c95c7-blcpt\" (UID: \"7f2c6e22-6a88-4c63-9da2-e38b813e0f1c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215845 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/745f7801-7150-4924-b9fb-e8a0aa1e7edb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215860 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-627w7\" (UniqueName: \"kubernetes.io/projected/28ce563b-8e5b-4abe-b71b-02c588bff511-kube-api-access-627w7\") pod \"router-default-5444994796-w6dzs\" (UID: 
\"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.215894 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/40014780-8cb8-47fa-8b2c-c4fb7d04a85c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-whlp9\" (UID: \"40014780-8cb8-47fa-8b2c-c4fb7d04a85c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216071 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216104 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhhkt\" (UniqueName: \"kubernetes.io/projected/a810b51a-5b19-4da9-ad80-05f189d821e4-kube-api-access-zhhkt\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216131 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmtdf\" (UniqueName: \"kubernetes.io/projected/8c8a6e75-7e5f-41c8-8312-b9d274284f35-kube-api-access-vmtdf\") pod \"multus-admission-controller-857f4d67dd-g9tz2\" (UID: \"8c8a6e75-7e5f-41c8-8312-b9d274284f35\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216173 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/745f7801-7150-4924-b9fb-e8a0aa1e7edb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216197 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a83789bf-1523-4d5e-892d-6597aed01b7d-srv-cert\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216303 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2bceecf8-583d-4e26-9749-f5939280540b-tmpfs\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216351 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-stats-auth\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216375 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/2bceecf8-583d-4e26-9749-f5939280540b-webhook-cert\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216403 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a76ad1-da33-4b42-9c0a-d0ada077729a-metrics-tls\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216432 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-metrics-tls\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216492 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lg7c\" (UniqueName: \"kubernetes.io/projected/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-kube-api-access-9lg7c\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216541 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216743 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fjmf\" (UniqueName: \"kubernetes.io/projected/a83789bf-1523-4d5e-892d-6597aed01b7d-kube-api-access-7fjmf\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216777 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6s9q\" (UniqueName: \"kubernetes.io/projected/6dd3a400-6155-44b9-a358-d2cd089db1f6-kube-api-access-d6s9q\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216823 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-default-certificate\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216885 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/745f7801-7150-4924-b9fb-e8a0aa1e7edb-config\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216915 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtldg\" (UniqueName: \"kubernetes.io/projected/2bceecf8-583d-4e26-9749-f5939280540b-kube-api-access-jtldg\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216932 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd3a400-6155-44b9-a358-d2cd089db1f6-config\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216969 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.216985 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75c2v\" (UniqueName: \"kubernetes.io/projected/685c3b51-a70f-484e-b7db-f98383f75003-kube-api-access-75c2v\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217006 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10dc7791-eebd-49e9-8d9c-63711119e9d7-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217184 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd3a400-6155-44b9-a358-d2cd089db1f6-serving-cert\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217242 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfdvw\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-kube-api-access-pfdvw\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217293 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217327 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/685c3b51-a70f-484e-b7db-f98383f75003-srv-cert\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 
04:27:00.217363 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217387 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bceecf8-583d-4e26-9749-f5939280540b-apiservice-cert\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217463 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a810b51a-5b19-4da9-ad80-05f189d821e4-proxy-tls\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217492 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8c8a6e75-7e5f-41c8-8312-b9d274284f35-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-g9tz2\" (UID: \"8c8a6e75-7e5f-41c8-8312-b9d274284f35\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.217523 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.219915 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:00.719896838 +0000 UTC m=+225.047683514 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220337 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a810b51a-5b19-4da9-ad80-05f189d821e4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220490 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v69rn\" (UniqueName: \"kubernetes.io/projected/6240548e-b827-4fdb-b2be-c7187d6a28e8-kube-api-access-v69rn\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 
crc kubenswrapper[4839]: I0321 04:27:00.220541 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-metrics-certs\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220589 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-tls\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220615 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7ef3f28d-e496-434e-a803-3b9a0fa24690-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220724 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-certificates\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220770 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6nxn\" (UniqueName: 
\"kubernetes.io/projected/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-kube-api-access-t6nxn\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220821 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m9qn\" (UniqueName: \"kubernetes.io/projected/40014780-8cb8-47fa-8b2c-c4fb7d04a85c-kube-api-access-8m9qn\") pod \"control-plane-machine-set-operator-78cbb6b69f-whlp9\" (UID: \"40014780-8cb8-47fa-8b2c-c4fb7d04a85c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220878 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7ef3f28d-e496-434e-a803-3b9a0fa24690-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220902 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/685c3b51-a70f-484e-b7db-f98383f75003-profile-collector-cert\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.220927 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a83789bf-1523-4d5e-892d-6597aed01b7d-profile-collector-cert\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.221550 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmkvx\" (UniqueName: \"kubernetes.io/projected/81a76ad1-da33-4b42-9c0a-d0ada077729a-kube-api-access-rmkvx\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.221685 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-trusted-ca\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.245869 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.263047 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.275506 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.323948 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.324361 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.324454 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:00.824433914 +0000 UTC m=+225.152220580 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.324626 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-stats-auth\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.324653 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bceecf8-583d-4e26-9749-f5939280540b-webhook-cert\") pod 
\"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.324700 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a76ad1-da33-4b42-9c0a-d0ada077729a-metrics-tls\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.325679 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-metrics-tls\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326305 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lg7c\" (UniqueName: \"kubernetes.io/projected/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-kube-api-access-9lg7c\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326329 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ad426123-af7f-45c4-8a6b-bca3c83017be-signing-key\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326349 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/a178972b-b463-42db-b2c9-dcba9a51c4bc-certs\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326402 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326422 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fjmf\" (UniqueName: \"kubernetes.io/projected/a83789bf-1523-4d5e-892d-6597aed01b7d-kube-api-access-7fjmf\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326461 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-default-certificate\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326477 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6s9q\" (UniqueName: \"kubernetes.io/projected/6dd3a400-6155-44b9-a358-d2cd089db1f6-kube-api-access-d6s9q\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" 
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326495 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/745f7801-7150-4924-b9fb-e8a0aa1e7edb-config\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326512 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtldg\" (UniqueName: \"kubernetes.io/projected/2bceecf8-583d-4e26-9749-f5939280540b-kube-api-access-jtldg\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326529 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326544 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd3a400-6155-44b9-a358-d2cd089db1f6-config\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326559 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/a178972b-b463-42db-b2c9-dcba9a51c4bc-node-bootstrap-token\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326601 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75c2v\" (UniqueName: \"kubernetes.io/projected/685c3b51-a70f-484e-b7db-f98383f75003-kube-api-access-75c2v\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326619 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10dc7791-eebd-49e9-8d9c-63711119e9d7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326663 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0368223e-2e01-4681-a7a6-67b77387f8d8-secret-volume\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326679 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-csi-data-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc 
kubenswrapper[4839]: I0321 04:27:00.326697 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd3a400-6155-44b9-a358-d2cd089db1f6-serving-cert\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326715 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1439545-f492-4e4c-858c-ec85c5c2a9d9-cert\") pod \"ingress-canary-brnnr\" (UID: \"d1439545-f492-4e4c-858c-ec85c5c2a9d9\") " pod="openshift-ingress-canary/ingress-canary-brnnr" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326735 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfdvw\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-kube-api-access-pfdvw\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326755 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326780 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/685c3b51-a70f-484e-b7db-f98383f75003-srv-cert\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326816 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326839 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bceecf8-583d-4e26-9749-f5939280540b-apiservice-cert\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326863 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a810b51a-5b19-4da9-ad80-05f189d821e4-proxy-tls\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326884 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8c8a6e75-7e5f-41c8-8312-b9d274284f35-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-g9tz2\" (UID: \"8c8a6e75-7e5f-41c8-8312-b9d274284f35\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.326898 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327010 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a810b51a-5b19-4da9-ad80-05f189d821e4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327032 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jr79\" (UniqueName: \"kubernetes.io/projected/ad426123-af7f-45c4-8a6b-bca3c83017be-kube-api-access-2jr79\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327058 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-plugins-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327084 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v69rn\" (UniqueName: \"kubernetes.io/projected/6240548e-b827-4fdb-b2be-c7187d6a28e8-kube-api-access-v69rn\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc 
kubenswrapper[4839]: I0321 04:27:00.327107 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-metrics-certs\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327122 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26gxn\" (UniqueName: \"kubernetes.io/projected/a178972b-b463-42db-b2c9-dcba9a51c4bc-kube-api-access-26gxn\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327139 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-tls\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327157 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tf6t\" (UniqueName: \"kubernetes.io/projected/d1439545-f492-4e4c-858c-ec85c5c2a9d9-kube-api-access-8tf6t\") pod \"ingress-canary-brnnr\" (UID: \"d1439545-f492-4e4c-858c-ec85c5c2a9d9\") " pod="openshift-ingress-canary/ingress-canary-brnnr" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327174 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-certificates\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: 
\"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327189 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7ef3f28d-e496-434e-a803-3b9a0fa24690-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327204 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0368223e-2e01-4681-a7a6-67b77387f8d8-config-volume\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327219 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9wch\" (UniqueName: \"kubernetes.io/projected/0368223e-2e01-4681-a7a6-67b77387f8d8-kube-api-access-l9wch\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327234 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6nxn\" (UniqueName: \"kubernetes.io/projected/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-kube-api-access-t6nxn\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327251 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7ef3f28d-e496-434e-a803-3b9a0fa24690-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327266 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m9qn\" (UniqueName: \"kubernetes.io/projected/40014780-8cb8-47fa-8b2c-c4fb7d04a85c-kube-api-access-8m9qn\") pod \"control-plane-machine-set-operator-78cbb6b69f-whlp9\" (UID: \"40014780-8cb8-47fa-8b2c-c4fb7d04a85c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327284 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/685c3b51-a70f-484e-b7db-f98383f75003-profile-collector-cert\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327470 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a83789bf-1523-4d5e-892d-6597aed01b7d-profile-collector-cert\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327507 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmkvx\" (UniqueName: \"kubernetes.io/projected/81a76ad1-da33-4b42-9c0a-d0ada077729a-kube-api-access-rmkvx\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc" 
Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327560 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-trusted-ca\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327628 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10dc7791-eebd-49e9-8d9c-63711119e9d7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327657 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-registration-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327695 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10dc7791-eebd-49e9-8d9c-63711119e9d7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327709 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-bound-sa-token\") pod 
\"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327723 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-trusted-ca\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327738 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjkdf\" (UniqueName: \"kubernetes.io/projected/609ace61-45d1-44f6-b378-fb97eecf2374-kube-api-access-vjkdf\") pod \"auto-csr-approver-29567786-d8w8k\" (UID: \"609ace61-45d1-44f6-b378-fb97eecf2374\") " pod="openshift-infra/auto-csr-approver-29567786-d8w8k" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327762 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ce563b-8e5b-4abe-b71b-02c588bff511-service-ca-bundle\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327779 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbz79\" (UniqueName: \"kubernetes.io/projected/7f2c6e22-6a88-4c63-9da2-e38b813e0f1c-kube-api-access-dbz79\") pod \"migrator-59844c95c7-blcpt\" (UID: \"7f2c6e22-6a88-4c63-9da2-e38b813e0f1c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327793 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/81a76ad1-da33-4b42-9c0a-d0ada077729a-config-volume\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327812 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/745f7801-7150-4924-b9fb-e8a0aa1e7edb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327833 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ad426123-af7f-45c4-8a6b-bca3c83017be-signing-cabundle\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327857 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-627w7\" (UniqueName: \"kubernetes.io/projected/28ce563b-8e5b-4abe-b71b-02c588bff511-kube-api-access-627w7\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327877 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-socket-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327912 4839 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/40014780-8cb8-47fa-8b2c-c4fb7d04a85c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-whlp9\" (UID: \"40014780-8cb8-47fa-8b2c-c4fb7d04a85c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327941 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-mountpoint-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327966 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.327983 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhhkt\" (UniqueName: \"kubernetes.io/projected/a810b51a-5b19-4da9-ad80-05f189d821e4-kube-api-access-zhhkt\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.328002 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmtdf\" (UniqueName: \"kubernetes.io/projected/8c8a6e75-7e5f-41c8-8312-b9d274284f35-kube-api-access-vmtdf\") pod \"multus-admission-controller-857f4d67dd-g9tz2\" 
(UID: \"8c8a6e75-7e5f-41c8-8312-b9d274284f35\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.328021 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/745f7801-7150-4924-b9fb-e8a0aa1e7edb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.328037 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a83789bf-1523-4d5e-892d-6597aed01b7d-srv-cert\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.328062 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vffpc\" (UniqueName: \"kubernetes.io/projected/4fee5524-9cb1-48c7-83b6-10bf3230c783-kube-api-access-vffpc\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.328088 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/2bceecf8-583d-4e26-9749-f5939280540b-tmpfs\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.328725 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/2bceecf8-583d-4e26-9749-f5939280540b-tmpfs\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.331757 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bceecf8-583d-4e26-9749-f5939280540b-webhook-cert\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.332205 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-stats-auth\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.334019 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-trusted-ca\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.336878 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-metrics-tls\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.337174 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/81a76ad1-da33-4b42-9c0a-d0ada077729a-metrics-tls\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.337392 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-certificates\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.337878 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ce563b-8e5b-4abe-b71b-02c588bff511-service-ca-bundle\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.339524 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.340177 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a810b51a-5b19-4da9-ad80-05f189d821e4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.341179 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.344448 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-default-certificate\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.344779 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7ef3f28d-e496-434e-a803-3b9a0fa24690-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.346243 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/745f7801-7150-4924-b9fb-e8a0aa1e7edb-config\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.347093 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10dc7791-eebd-49e9-8d9c-63711119e9d7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.347167 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/685c3b51-a70f-484e-b7db-f98383f75003-srv-cert\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.347258 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2bceecf8-583d-4e26-9749-f5939280540b-apiservice-cert\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.348098 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:00.848080391 +0000 UTC m=+225.175867067 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.349421 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.349950 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81a76ad1-da33-4b42-9c0a-d0ada077729a-config-volume\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.350915 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-trusted-ca\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.350988 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a83789bf-1523-4d5e-892d-6597aed01b7d-srv-cert\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.351374 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd3a400-6155-44b9-a358-d2cd089db1f6-config\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.351945 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10dc7791-eebd-49e9-8d9c-63711119e9d7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.359337 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.359592 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a810b51a-5b19-4da9-ad80-05f189d821e4-proxy-tls\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.360101 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.360537 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-tls\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.360609 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a83789bf-1523-4d5e-892d-6597aed01b7d-profile-collector-cert\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.361088 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/8c8a6e75-7e5f-41c8-8312-b9d274284f35-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-g9tz2\" (UID: \"8c8a6e75-7e5f-41c8-8312-b9d274284f35\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.369616 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28ce563b-8e5b-4abe-b71b-02c588bff511-metrics-certs\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.370211 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/40014780-8cb8-47fa-8b2c-c4fb7d04a85c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-whlp9\" (UID: \"40014780-8cb8-47fa-8b2c-c4fb7d04a85c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.370445 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/745f7801-7150-4924-b9fb-e8a0aa1e7edb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.370921 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7ef3f28d-e496-434e-a803-3b9a0fa24690-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc 
kubenswrapper[4839]: I0321 04:27:00.371992 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/685c3b51-a70f-484e-b7db-f98383f75003-profile-collector-cert\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.378081 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.391465 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd3a400-6155-44b9-a358-d2cd089db1f6-serving-cert\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.392036 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.393019 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lg7c\" (UniqueName: \"kubernetes.io/projected/34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9-kube-api-access-9lg7c\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2g8w\" (UID: \"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.400464 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.407743 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fjmf\" (UniqueName: \"kubernetes.io/projected/a83789bf-1523-4d5e-892d-6597aed01b7d-kube-api-access-7fjmf\") pod \"catalog-operator-68c6474976-xldvn\" (UID: \"a83789bf-1523-4d5e-892d-6597aed01b7d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.422417 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v69rn\" (UniqueName: \"kubernetes.io/projected/6240548e-b827-4fdb-b2be-c7187d6a28e8-kube-api-access-v69rn\") pod \"marketplace-operator-79b997595-8jgh7\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.424908 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.431940 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432264 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ad426123-af7f-45c4-8a6b-bca3c83017be-signing-key\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432296 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a178972b-b463-42db-b2c9-dcba9a51c4bc-certs\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432341 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a178972b-b463-42db-b2c9-dcba9a51c4bc-node-bootstrap-token\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432372 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0368223e-2e01-4681-a7a6-67b77387f8d8-secret-volume\") pod \"collect-profiles-29567775-lfv48\" (UID: 
\"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432391 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-csi-data-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432412 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1439545-f492-4e4c-858c-ec85c5c2a9d9-cert\") pod \"ingress-canary-brnnr\" (UID: \"d1439545-f492-4e4c-858c-ec85c5c2a9d9\") " pod="openshift-ingress-canary/ingress-canary-brnnr" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432485 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jr79\" (UniqueName: \"kubernetes.io/projected/ad426123-af7f-45c4-8a6b-bca3c83017be-kube-api-access-2jr79\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432508 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-plugins-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432537 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26gxn\" (UniqueName: \"kubernetes.io/projected/a178972b-b463-42db-b2c9-dcba9a51c4bc-kube-api-access-26gxn\") pod \"machine-config-server-4sj57\" (UID: 
\"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432559 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tf6t\" (UniqueName: \"kubernetes.io/projected/d1439545-f492-4e4c-858c-ec85c5c2a9d9-kube-api-access-8tf6t\") pod \"ingress-canary-brnnr\" (UID: \"d1439545-f492-4e4c-858c-ec85c5c2a9d9\") " pod="openshift-ingress-canary/ingress-canary-brnnr" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432605 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0368223e-2e01-4681-a7a6-67b77387f8d8-config-volume\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432628 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9wch\" (UniqueName: \"kubernetes.io/projected/0368223e-2e01-4681-a7a6-67b77387f8d8-kube-api-access-l9wch\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432679 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-registration-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432728 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjkdf\" (UniqueName: 
\"kubernetes.io/projected/609ace61-45d1-44f6-b378-fb97eecf2374-kube-api-access-vjkdf\") pod \"auto-csr-approver-29567786-d8w8k\" (UID: \"609ace61-45d1-44f6-b378-fb97eecf2374\") " pod="openshift-infra/auto-csr-approver-29567786-d8w8k" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432778 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ad426123-af7f-45c4-8a6b-bca3c83017be-signing-cabundle\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432806 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-socket-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432826 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-mountpoint-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432871 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vffpc\" (UniqueName: \"kubernetes.io/projected/4fee5524-9cb1-48c7-83b6-10bf3230c783-kube-api-access-vffpc\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432889 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-g2rrh"] Mar 
21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.432934 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k5nwf"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.433988 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0368223e-2e01-4681-a7a6-67b77387f8d8-config-volume\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.434140 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-registration-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.434400 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-plugins-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.437409 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a178972b-b463-42db-b2c9-dcba9a51c4bc-node-bootstrap-token\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.440310 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/ad426123-af7f-45c4-8a6b-bca3c83017be-signing-key\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.440467 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:00.940434703 +0000 UTC m=+225.268221519 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.440587 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-socket-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.440911 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-csi-data-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.441024 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/4fee5524-9cb1-48c7-83b6-10bf3230c783-mountpoint-dir\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.445436 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ad426123-af7f-45c4-8a6b-bca3c83017be-signing-cabundle\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.445550 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhhkt\" (UniqueName: \"kubernetes.io/projected/a810b51a-5b19-4da9-ad80-05f189d821e4-kube-api-access-zhhkt\") pod \"machine-config-controller-84d6567774-hlk25\" (UID: \"a810b51a-5b19-4da9-ad80-05f189d821e4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.446476 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a178972b-b463-42db-b2c9-dcba9a51c4bc-certs\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.448193 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0368223e-2e01-4681-a7a6-67b77387f8d8-secret-volume\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.450248 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/d1439545-f492-4e4c-858c-ec85c5c2a9d9-cert\") pod \"ingress-canary-brnnr\" (UID: \"d1439545-f492-4e4c-858c-ec85c5c2a9d9\") " pod="openshift-ingress-canary/ingress-canary-brnnr" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.462062 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.465332 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtldg\" (UniqueName: \"kubernetes.io/projected/2bceecf8-583d-4e26-9749-f5939280540b-kube-api-access-jtldg\") pod \"packageserver-d55dfcdfc-vg8dq\" (UID: \"2bceecf8-583d-4e26-9749-f5939280540b\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.486811 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6s9q\" (UniqueName: \"kubernetes.io/projected/6dd3a400-6155-44b9-a358-d2cd089db1f6-kube-api-access-d6s9q\") pod \"service-ca-operator-777779d784-shqhf\" (UID: \"6dd3a400-6155-44b9-a358-d2cd089db1f6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.487982 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.509633 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmkvx\" (UniqueName: \"kubernetes.io/projected/81a76ad1-da33-4b42-9c0a-d0ada077729a-kube-api-access-rmkvx\") pod \"dns-default-5jhkc\" (UID: \"81a76ad1-da33-4b42-9c0a-d0ada077729a\") " pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.526436 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6nxn\" (UniqueName: \"kubernetes.io/projected/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-kube-api-access-t6nxn\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.535531 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" event={"ID":"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1","Type":"ContainerStarted","Data":"bebe147c7d3c9f550ac210dc3b87fa28986a146b1b4c1a309591b5d1c2e502e6"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.536293 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.537709 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-21 04:27:01.037680161 +0000 UTC m=+225.365466837 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.543677 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmtdf\" (UniqueName: \"kubernetes.io/projected/8c8a6e75-7e5f-41c8-8312-b9d274284f35-kube-api-access-vmtdf\") pod \"multus-admission-controller-857f4d67dd-g9tz2\" (UID: \"8c8a6e75-7e5f-41c8-8312-b9d274284f35\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.544079 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" event={"ID":"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5","Type":"ContainerStarted","Data":"69c4299d7bce6eb82eeb4f3117432443b42c0d2372ac0441df1988528f83756e"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.551467 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qp8mz" event={"ID":"4d63cdfd-21e7-4a63-960b-363fb131ac08","Type":"ContainerStarted","Data":"3f550c2c325d41fb1c4343671414703435e40bc13120f59039726685c708adaf"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.553763 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" 
event={"ID":"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf","Type":"ContainerStarted","Data":"104bf948e8df0428dbcca58a1fb16cf0fb51aec4d7ecc1e8a05a5ab60ffd5268"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.562346 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" event={"ID":"cae2f42f-b7c7-43c7-b397-a8273ea5844b","Type":"ContainerStarted","Data":"12ed70382799e5e4264fe38403057c4e4da70ccbeb46a9c9bf332c2f17ea1512"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.564334 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfdvw\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-kube-api-access-pfdvw\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.567013 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" event={"ID":"e81e2384-94b0-4639-bb2d-e4152385c932","Type":"ContainerStarted","Data":"e0cf19f30a06660139a2f20b41696b02c7e76a7a7208cb22acb223aa74d717d6"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.567052 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" event={"ID":"e81e2384-94b0-4639-bb2d-e4152385c932","Type":"ContainerStarted","Data":"14f69a0375e6c5d03a334a0b16024be0f89d91b54fd01559c41e7673087b6d53"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.567853 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.569636 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" event={"ID":"3498feaf-72d5-471a-b25e-fb4b68875767","Type":"ContainerStarted","Data":"3be930827d4d25e493d6f8a65c67c37e9fd4d484ec4f005269d96061246be637"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.569666 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" event={"ID":"3498feaf-72d5-471a-b25e-fb4b68875767","Type":"ContainerStarted","Data":"37e1b23e6295895c07381fb6ccbe7c11a0a79e23312f6edb20749b2a0cf5c684"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.570179 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.572555 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" event={"ID":"6bfbd19d-a44a-459c-bd6e-150241ce3ebb","Type":"ContainerStarted","Data":"acab2b98e2e3828439a02d33c7b3fd1855365edb0946861b3e5dc01800f9adfe"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.574112 4839 generic.go:334] "Generic (PLEG): container finished" podID="67adff78-dfe5-440a-80b0-fefd703c3aa7" containerID="86006a257303cbb685395d36b051f2b2669e91a34cb7a57c065b2e9ca3122ed3" exitCode=0 Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.574205 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" event={"ID":"67adff78-dfe5-440a-80b0-fefd703c3aa7","Type":"ContainerDied","Data":"86006a257303cbb685395d36b051f2b2669e91a34cb7a57c065b2e9ca3122ed3"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.577929 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m9qn\" (UniqueName: \"kubernetes.io/projected/40014780-8cb8-47fa-8b2c-c4fb7d04a85c-kube-api-access-8m9qn\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-whlp9\" (UID: \"40014780-8cb8-47fa-8b2c-c4fb7d04a85c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.579956 4839 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-45jfn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.579988 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" podUID="3498feaf-72d5-471a-b25e-fb4b68875767" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.580289 4839 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-76ctz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.580310 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" podUID="e81e2384-94b0-4639-bb2d-e4152385c932" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.589253 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 
04:27:00.591476 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" event={"ID":"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54","Type":"ContainerStarted","Data":"db930d1be9182a20b88151b3736501f6341e4e6a1cbd302ae707a449c2596737"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.591525 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" event={"ID":"c4d393d7-42d7-4b7d-a3cd-f7e325b97c54","Type":"ContainerStarted","Data":"00b4c67df5fe9cc89fd22031364169470c3fcfc89d4ce53b74bb4b6af15acd7a"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.603099 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbz79\" (UniqueName: \"kubernetes.io/projected/7f2c6e22-6a88-4c63-9da2-e38b813e0f1c-kube-api-access-dbz79\") pod \"migrator-59844c95c7-blcpt\" (UID: \"7f2c6e22-6a88-4c63-9da2-e38b813e0f1c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.617113 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75c2v\" (UniqueName: \"kubernetes.io/projected/685c3b51-a70f-484e-b7db-f98383f75003-kube-api-access-75c2v\") pod \"olm-operator-6b444d44fb-78xr9\" (UID: \"685c3b51-a70f-484e-b7db-f98383f75003\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.617142 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" event={"ID":"9d291bc8-87c0-4a9e-b269-52a7801f050b","Type":"ContainerStarted","Data":"2a0fd9145ce6bbc2d66d5309033c5df6833aa692240adbf22a21b5c347926398"} Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.630080 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.637964 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.638006 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.638062 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.138040372 +0000 UTC m=+225.465827048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.638516 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.638813 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/745f7801-7150-4924-b9fb-e8a0aa1e7edb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-djrdt\" (UID: \"745f7801-7150-4924-b9fb-e8a0aa1e7edb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.639511 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.139490425 +0000 UTC m=+225.467277101 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.656894 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9m6tl\" (UID: \"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.675539 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.683844 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.684295 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-627w7\" (UniqueName: \"kubernetes.io/projected/28ce563b-8e5b-4abe-b71b-02c588bff511-kube-api-access-627w7\") pod \"router-default-5444994796-w6dzs\" (UID: \"28ce563b-8e5b-4abe-b71b-02c588bff511\") " pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.691711 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.706448 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2s6j7"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.706688 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10dc7791-eebd-49e9-8d9c-63711119e9d7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hmqv7\" (UID: \"10dc7791-eebd-49e9-8d9c-63711119e9d7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.707336 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.716629 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.728482 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-bound-sa-token\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.744180 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.744393 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.744460 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.244441483 +0000 UTC m=+225.572228159 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.744948 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.745688 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.24567298 +0000 UTC m=+225.573459646 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.748959 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.752963 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.768188 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.779246 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.781158 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vffpc\" (UniqueName: \"kubernetes.io/projected/4fee5524-9cb1-48c7-83b6-10bf3230c783-kube-api-access-vffpc\") pod \"csi-hostpathplugin-cstqb\" (UID: \"4fee5524-9cb1-48c7-83b6-10bf3230c783\") " pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.786134 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26gxn\" (UniqueName: \"kubernetes.io/projected/a178972b-b463-42db-b2c9-dcba9a51c4bc-kube-api-access-26gxn\") pod \"machine-config-server-4sj57\" (UID: \"a178972b-b463-42db-b2c9-dcba9a51c4bc\") " pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.798613 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.802268 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tf6t\" (UniqueName: \"kubernetes.io/projected/d1439545-f492-4e4c-858c-ec85c5c2a9d9-kube-api-access-8tf6t\") pod \"ingress-canary-brnnr\" (UID: 
\"d1439545-f492-4e4c-858c-ec85c5c2a9d9\") " pod="openshift-ingress-canary/ingress-canary-brnnr" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.806469 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4sj57" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.820542 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9wch\" (UniqueName: \"kubernetes.io/projected/0368223e-2e01-4681-a7a6-67b77387f8d8-kube-api-access-l9wch\") pod \"collect-profiles-29567775-lfv48\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.839140 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-brnnr" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.842808 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cstqb" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.847052 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.847400 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.347383581 +0000 UTC m=+225.675170257 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.848862 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjkdf\" (UniqueName: \"kubernetes.io/projected/609ace61-45d1-44f6-b378-fb97eecf2374-kube-api-access-vjkdf\") pod \"auto-csr-approver-29567786-d8w8k\" (UID: \"609ace61-45d1-44f6-b378-fb97eecf2374\") " pod="openshift-infra/auto-csr-approver-29567786-d8w8k" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.865864 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hkg98"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.868873 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-bj929"] Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.873355 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jr79\" (UniqueName: \"kubernetes.io/projected/ad426123-af7f-45c4-8a6b-bca3c83017be-kube-api-access-2jr79\") pod \"service-ca-9c57cc56f-6rrrs\" (UID: \"ad426123-af7f-45c4-8a6b-bca3c83017be\") " pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:00 crc kubenswrapper[4839]: W0321 04:27:00.911420 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3db892a0_fb40_4e0e_93ee_a8f2876ad8be.slice/crio-257ab6cb6e299cc5ad93a18838e91cd1a2df571cc8cf41230ae4ec84ecc3404e WatchSource:0}: Error finding container 
257ab6cb6e299cc5ad93a18838e91cd1a2df571cc8cf41230ae4ec84ecc3404e: Status 404 returned error can't find the container with id 257ab6cb6e299cc5ad93a18838e91cd1a2df571cc8cf41230ae4ec84ecc3404e Mar 21 04:27:00 crc kubenswrapper[4839]: W0321 04:27:00.944831 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebdfec0a_a8bf_47b0_b51a_75a76d4341f2.slice/crio-774dc5e188fdd2949c4be19591127c4007be44d80647d0129310982be9176b4a WatchSource:0}: Error finding container 774dc5e188fdd2949c4be19591127c4007be44d80647d0129310982be9176b4a: Status 404 returned error can't find the container with id 774dc5e188fdd2949c4be19591127c4007be44d80647d0129310982be9176b4a Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.962382 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.965986 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:00 crc kubenswrapper[4839]: W0321 04:27:00.966627 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28ce563b_8e5b_4abe_b71b_02c588bff511.slice/crio-9397277df1875a90eed39cc6dd9f436aae3aa7999383ad4c20bd29d0e86597d9 WatchSource:0}: Error finding container 9397277df1875a90eed39cc6dd9f436aae3aa7999383ad4c20bd29d0e86597d9: Status 404 returned error can't find the container with id 9397277df1875a90eed39cc6dd9f436aae3aa7999383ad4c20bd29d0e86597d9 Mar 21 04:27:00 crc kubenswrapper[4839]: E0321 04:27:00.966979 4839 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.466959697 +0000 UTC m=+225.794746373 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:00 crc kubenswrapper[4839]: I0321 04:27:00.970042 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8jgh7"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.028246 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.075736 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.076099 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.57607883 +0000 UTC m=+225.903865506 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.076638 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.077369 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.577358378 +0000 UTC m=+225.905145054 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.098004 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" Mar 21 04:27:01 crc kubenswrapper[4839]: W0321 04:27:01.100225 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6240548e_b827_4fdb_b2be_c7187d6a28e8.slice/crio-dec41352b22dc4b1f265aecf13bbf9f995403b64a2bfc4f44c88616523722931 WatchSource:0}: Error finding container dec41352b22dc4b1f265aecf13bbf9f995403b64a2bfc4f44c88616523722931: Status 404 returned error can't find the container with id dec41352b22dc4b1f265aecf13bbf9f995403b64a2bfc4f44c88616523722931 Mar 21 04:27:01 crc kubenswrapper[4839]: W0321 04:27:01.101643 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda79fc033_c671_42ff_aa06_78ae64967c92.slice/crio-142271d8088c37908157cb75e28ad863f196a6dd38dd6b39f7a315657d6ce315 WatchSource:0}: Error finding container 142271d8088c37908157cb75e28ad863f196a6dd38dd6b39f7a315657d6ce315: Status 404 returned error can't find the container with id 142271d8088c37908157cb75e28ad863f196a6dd38dd6b39f7a315657d6ce315 Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.111358 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.131517 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.186076 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.186300 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.686255134 +0000 UTC m=+226.014041810 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.186591 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.187416 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.687408369 +0000 UTC m=+226.015195045 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.197977 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-nmj8p" podStartSLOduration=160.197959424 podStartE2EDuration="2m40.197959424s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:01.197778839 +0000 UTC m=+225.525565515" watchObservedRunningTime="2026-03-21 04:27:01.197959424 +0000 UTC m=+225.525746100" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.234140 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.288349 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.289276 4839 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.789255144 +0000 UTC m=+226.117041820 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: W0321 04:27:01.323026 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bc52acb_29f0_4f24_a46a_928a529264dc.slice/crio-1ec9bfe764509947efa939fe8f8f570a445c6f87cdfb141b97b2916972a436e8 WatchSource:0}: Error finding container 1ec9bfe764509947efa939fe8f8f570a445c6f87cdfb141b97b2916972a436e8: Status 404 returned error can't find the container with id 1ec9bfe764509947efa939fe8f8f570a445c6f87cdfb141b97b2916972a436e8 Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.374707 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.376666 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-shqhf"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.391904 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.392256 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.892238304 +0000 UTC m=+226.220024980 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.401422 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.423013 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.444216 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.454950 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.488482 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt"] Mar 21 04:27:01 crc 
kubenswrapper[4839]: I0321 04:27:01.491579 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" podStartSLOduration=161.491540553 podStartE2EDuration="2m41.491540553s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:01.484855883 +0000 UTC m=+225.812642569" watchObservedRunningTime="2026-03-21 04:27:01.491540553 +0000 UTC m=+225.819327229" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.494323 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.496212 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.496893 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:01.996869922 +0000 UTC m=+226.324656598 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.560911 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" podStartSLOduration=160.560895157 podStartE2EDuration="2m40.560895157s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:01.513982004 +0000 UTC m=+225.841768680" watchObservedRunningTime="2026-03-21 04:27:01.560895157 +0000 UTC m=+225.888681833" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.561666 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt"] Mar 21 04:27:01 crc kubenswrapper[4839]: W0321 04:27:01.566589 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda178972b_b463_42db_b2c9_dcba9a51c4bc.slice/crio-7e4bb9e4330247c67d80afd7268f7a8acc7be6291db286ad65c56513d5057c79 WatchSource:0}: Error finding container 7e4bb9e4330247c67d80afd7268f7a8acc7be6291db286ad65c56513d5057c79: Status 404 returned error can't find the container with id 7e4bb9e4330247c67d80afd7268f7a8acc7be6291db286ad65c56513d5057c79 Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.567186 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.598344 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.600337 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.100314706 +0000 UTC m=+226.428101382 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: W0321 04:27:01.639015 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34efe2c8_d7a8_47c1_8890_85ebd5ef1eb9.slice/crio-f687bfd38b7cfa3b591f3fb6b760992a834ed7768fb5ee365a16b165dfb16871 WatchSource:0}: Error finding container f687bfd38b7cfa3b591f3fb6b760992a834ed7768fb5ee365a16b165dfb16871: Status 404 returned error can't find the container with id f687bfd38b7cfa3b591f3fb6b760992a834ed7768fb5ee365a16b165dfb16871 Mar 21 04:27:01 crc kubenswrapper[4839]: W0321 04:27:01.639490 4839 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dd3a400_6155_44b9_a358_d2cd089db1f6.slice/crio-570791c95d0b2b190c1d8da17592465418d253b93a8afbc494ffb34625ed0073 WatchSource:0}: Error finding container 570791c95d0b2b190c1d8da17592465418d253b93a8afbc494ffb34625ed0073: Status 404 returned error can't find the container with id 570791c95d0b2b190c1d8da17592465418d253b93a8afbc494ffb34625ed0073 Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.643662 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" event={"ID":"c7fd2012-4ccf-4fb6-ad00-eeb7fd6be2cf","Type":"ContainerStarted","Data":"87165f3b58c7c76de9f3e3e80e2dcb2a93d5a302c209109ff14fff2635704ea2"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.662620 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" event={"ID":"25af4e9d-c029-4ee7-9952-18a3a5e3c333","Type":"ContainerStarted","Data":"e50502a2010cac7980cfefa0621f469d7b42ed82cc45209c2b4818de457bba55"} Mar 21 04:27:01 crc kubenswrapper[4839]: W0321 04:27:01.663323 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda810b51a_5b19_4da9_ad80_05f189d821e4.slice/crio-2bac721ba6d2c55bdfbd0753be9818e4659f19ed6afa991c8d3cd42bf3709d6d WatchSource:0}: Error finding container 2bac721ba6d2c55bdfbd0753be9818e4659f19ed6afa991c8d3cd42bf3709d6d: Status 404 returned error can't find the container with id 2bac721ba6d2c55bdfbd0753be9818e4659f19ed6afa991c8d3cd42bf3709d6d Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.673298 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" 
event={"ID":"93bc1508-a828-4d23-b078-1d4164d1bc2c","Type":"ContainerStarted","Data":"94c766f682c6ebef7629e40f94406109f19df712e94a944ff6c5ac196f0815cf"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.677444 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" event={"ID":"67adff78-dfe5-440a-80b0-fefd703c3aa7","Type":"ContainerStarted","Data":"f6b6c78c9d1bdb848eba4c2502571d555e01e3ddf133248352c6864deb900c4d"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.681269 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hkg98" event={"ID":"f7156267-6917-4c54-ba75-4a91a0772025","Type":"ContainerStarted","Data":"bec5b782b49b348195a3494de220af759b7d819426abf2092fef180efedbebb1"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.691620 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qp8mz" event={"ID":"4d63cdfd-21e7-4a63-960b-363fb131ac08","Type":"ContainerStarted","Data":"c59c5df1b834753b3397abb13d229ecb94c80f25b00f08838046e04f48ad820c"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.691796 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qp8mz" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.694737 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w6dzs" event={"ID":"28ce563b-8e5b-4abe-b71b-02c588bff511","Type":"ContainerStarted","Data":"9397277df1875a90eed39cc6dd9f436aae3aa7999383ad4c20bd29d0e86597d9"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.696841 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" event={"ID":"6bfbd19d-a44a-459c-bd6e-150241ce3ebb","Type":"ContainerStarted","Data":"5917d0257c4a81565499bf920cd6ba405e8b8d34fcd640889d185cadd9ae650d"} Mar 21 04:27:01 
crc kubenswrapper[4839]: I0321 04:27:01.697556 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.700989 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.701622 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.201594584 +0000 UTC m=+226.529381260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.701818 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.702054 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" event={"ID":"2bc52acb-29f0-4f24-a46a-928a529264dc","Type":"ContainerStarted","Data":"1ec9bfe764509947efa939fe8f8f570a445c6f87cdfb141b97b2916972a436e8"} Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.702555 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.202544933 +0000 UTC m=+226.530331609 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.705984 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" event={"ID":"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1","Type":"ContainerStarted","Data":"7dadea2ef0ce3a1677e46665b198a5ba8d6e517c88efd0a391f511f13f383ce4"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.730666 4839 generic.go:334] "Generic (PLEG): container finished" podID="9d291bc8-87c0-4a9e-b269-52a7801f050b" containerID="60becfbe63e155a2f60cd187ea8d4be4fbd710e4f3d615c52e807eb20a456a0f" exitCode=0 Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.730866 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" event={"ID":"9d291bc8-87c0-4a9e-b269-52a7801f050b","Type":"ContainerDied","Data":"60becfbe63e155a2f60cd187ea8d4be4fbd710e4f3d615c52e807eb20a456a0f"} Mar 21 04:27:01 crc 
kubenswrapper[4839]: I0321 04:27:01.733818 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4sj57" event={"ID":"a178972b-b463-42db-b2c9-dcba9a51c4bc","Type":"ContainerStarted","Data":"7e4bb9e4330247c67d80afd7268f7a8acc7be6291db286ad65c56513d5057c79"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.735207 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bj929" event={"ID":"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2","Type":"ContainerStarted","Data":"774dc5e188fdd2949c4be19591127c4007be44d80647d0129310982be9176b4a"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.741322 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" event={"ID":"a79fc033-c671-42ff-aa06-78ae64967c92","Type":"ContainerStarted","Data":"142271d8088c37908157cb75e28ad863f196a6dd38dd6b39f7a315657d6ce315"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.749933 4839 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-zt77f container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.750017 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" podUID="6bfbd19d-a44a-459c-bd6e-150241ce3ebb" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.751506 4839 patch_prober.go:28] interesting pod/downloads-7954f5f757-qp8mz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.751630 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qp8mz" podUID="4d63cdfd-21e7-4a63-960b-363fb131ac08" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.820664 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.823338 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.323314364 +0000 UTC m=+226.651101040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.865858 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" event={"ID":"cae2f42f-b7c7-43c7-b397-a8273ea5844b","Type":"ContainerStarted","Data":"8b2e1ca7f6124db0444244a4f24ba74da0d7c7ef5546bae4cc34079e0bc014c1"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.872073 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.876984 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" event={"ID":"6240548e-b827-4fdb-b2be-c7187d6a28e8","Type":"ContainerStarted","Data":"dec41352b22dc4b1f265aecf13bbf9f995403b64a2bfc4f44c88616523722931"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.885069 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" event={"ID":"3db892a0-fb40-4e0e-93ee-a8f2876ad8be","Type":"ContainerStarted","Data":"257ab6cb6e299cc5ad93a18838e91cd1a2df571cc8cf41230ae4ec84ecc3404e"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.887796 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" event={"ID":"b60d6f1b-b109-4fa4-a85d-ebb845b342bd","Type":"ContainerStarted","Data":"36a30ee3de09b6a8070c6d8df57ed3e960516cc6813b51d774b7ac64b759f079"} Mar 21 
04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.895837 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" event={"ID":"f7a32fe8-bd63-44c6-a90c-a6a5438e3cd5","Type":"ContainerStarted","Data":"18ebfd00c2bdfa1069adfe9873fe32d14a8d869d0ed9b0b7f2aafcdd2abbfa77"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.898862 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.923150 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cstqb"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.923761 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" event={"ID":"28599c04-0840-41a0-91dd-c0ed5bcf99fd","Type":"ContainerStarted","Data":"df9320e90b9c1289c659ebd828f5ec8dc8624d7becb6e45520b5ad077c294540"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.923785 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" event={"ID":"28599c04-0840-41a0-91dd-c0ed5bcf99fd","Type":"ContainerStarted","Data":"5e968af271e571456f1fc861cbce1ec77c551902e916a8cd6bcf6e1d56aff536"} Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.925318 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.925541 4839 patch_prober.go:28] interesting 
pod/route-controller-manager-6576b87f9c-76ctz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.925604 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" podUID="e81e2384-94b0-4639-bb2d-e4152385c932" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 21 04:27:01 crc kubenswrapper[4839]: E0321 04:27:01.926144 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.426131069 +0000 UTC m=+226.753917745 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.933100 4839 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-45jfn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.933186 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" podUID="3498feaf-72d5-471a-b25e-fb4b68875767" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.942796 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-g9tz2"] Mar 21 04:27:01 crc kubenswrapper[4839]: I0321 04:27:01.949097 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9"] Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.006395 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48"] Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.035040 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.036233 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.53621556 +0000 UTC m=+226.864002236 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:02 crc kubenswrapper[4839]: W0321 04:27:02.041379 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fc77b0b_af5d_42f5_8fd1_69ac8d6616d8.slice/crio-3c79577dadd9928c69db7c80eda570ad44512d4d86f82078636cc99a6209ca04 WatchSource:0}: Error finding container 3c79577dadd9928c69db7c80eda570ad44512d4d86f82078636cc99a6209ca04: Status 404 returned error can't find the container with id 3c79577dadd9928c69db7c80eda570ad44512d4d86f82078636cc99a6209ca04 Mar 21 04:27:02 crc kubenswrapper[4839]: W0321 04:27:02.042746 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bceecf8_583d_4e26_9749_f5939280540b.slice/crio-f454b3f61918548eed258d39d1cc2445884b335cf1cb9cc8888fef8a8986c650 WatchSource:0}: Error finding container f454b3f61918548eed258d39d1cc2445884b335cf1cb9cc8888fef8a8986c650: Status 
404 returned error can't find the container with id f454b3f61918548eed258d39d1cc2445884b335cf1cb9cc8888fef8a8986c650 Mar 21 04:27:02 crc kubenswrapper[4839]: W0321 04:27:02.047885 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod685c3b51_a70f_484e_b7db_f98383f75003.slice/crio-046845a3c18469b2c4953cb4c3caa7c880744e360ea33c1ec1534a3bfca174e1 WatchSource:0}: Error finding container 046845a3c18469b2c4953cb4c3caa7c880744e360ea33c1ec1534a3bfca174e1: Status 404 returned error can't find the container with id 046845a3c18469b2c4953cb4c3caa7c880744e360ea33c1ec1534a3bfca174e1 Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.061427 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567786-d8w8k"] Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.109010 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5jhkc"] Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.126183 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-brnnr"] Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.129173 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6rrrs"] Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.139402 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.140522 4839 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.640487208 +0000 UTC m=+226.968273884 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.143708 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 04:27:02 crc kubenswrapper[4839]: W0321 04:27:02.143650 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81a76ad1_da33_4b42_9c0a_d0ada077729a.slice/crio-a9427ef3e75a30fe9f8aa880b7baddc8b52f8630e29a8f3524aaed9baa58f7f8 WatchSource:0}: Error finding container a9427ef3e75a30fe9f8aa880b7baddc8b52f8630e29a8f3524aaed9baa58f7f8: Status 404 returned error can't find the container with id a9427ef3e75a30fe9f8aa880b7baddc8b52f8630e29a8f3524aaed9baa58f7f8 Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.240464 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.241137 4839 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.741117338 +0000 UTC m=+227.068904014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.241293 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.241654 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.741647483 +0000 UTC m=+227.069434159 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.346320 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.346468 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.846443327 +0000 UTC m=+227.174230003 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.347658 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.348290 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.848275012 +0000 UTC m=+227.176061688 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.449886 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.450598 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:02.950583081 +0000 UTC m=+227.278369757 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.545648 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" podStartSLOduration=161.545623473 podStartE2EDuration="2m41.545623473s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:02.542815129 +0000 UTC m=+226.870601815" watchObservedRunningTime="2026-03-21 04:27:02.545623473 +0000 UTC m=+226.873410149" Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.551406 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.551964 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.051952522 +0000 UTC m=+227.379739198 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.590920 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8sv4j" podStartSLOduration=162.590905437 podStartE2EDuration="2m42.590905437s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:02.568040043 +0000 UTC m=+226.895826729" watchObservedRunningTime="2026-03-21 04:27:02.590905437 +0000 UTC m=+226.918692113" Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.591636 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qp8mz" podStartSLOduration=162.591629649 podStartE2EDuration="2m42.591629649s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:02.591103673 +0000 UTC m=+226.918890349" watchObservedRunningTime="2026-03-21 04:27:02.591629649 +0000 UTC m=+226.919416325" Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.639091 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5nwf" podStartSLOduration=162.639074568 podStartE2EDuration="2m42.639074568s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:02.637797139 +0000 UTC m=+226.965583805" watchObservedRunningTime="2026-03-21 04:27:02.639074568 +0000 UTC m=+226.966861234" Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.653190 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.653682 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.153658764 +0000 UTC m=+227.481445440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.736942 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" podStartSLOduration=162.736925374 podStartE2EDuration="2m42.736925374s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:02.682531697 +0000 UTC m=+227.010318383" watchObservedRunningTime="2026-03-21 04:27:02.736925374 +0000 UTC m=+227.064712050" Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.759545 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.760140 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.260123497 +0000 UTC m=+227.587910173 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.867876 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.868192 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.368174148 +0000 UTC m=+227.695960824 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.955066 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" event={"ID":"2bc52acb-29f0-4f24-a46a-928a529264dc","Type":"ContainerStarted","Data":"462a8bb447e356d4b7a74bcc52ea16689ebef97585cfaf2dd0b1940d90242dbe"} Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.972396 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:02 crc kubenswrapper[4839]: E0321 04:27:02.972796 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.472783056 +0000 UTC m=+227.800569732 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.973269 4839 generic.go:334] "Generic (PLEG): container finished" podID="b60d6f1b-b109-4fa4-a85d-ebb845b342bd" containerID="045c9336a13b752744cb5d13222b2d90daf20b00e721612357866f428a0e3828" exitCode=0 Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.974461 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" event={"ID":"b60d6f1b-b109-4fa4-a85d-ebb845b342bd","Type":"ContainerDied","Data":"045c9336a13b752744cb5d13222b2d90daf20b00e721612357866f428a0e3828"} Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.984460 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" event={"ID":"a810b51a-5b19-4da9-ad80-05f189d821e4","Type":"ContainerStarted","Data":"3720dcdc84d0b541718745f3af152af7b1b68a7618c107c94916993866a70cf0"} Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.984510 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" event={"ID":"a810b51a-5b19-4da9-ad80-05f189d821e4","Type":"ContainerStarted","Data":"2bac721ba6d2c55bdfbd0753be9818e4659f19ed6afa991c8d3cd42bf3709d6d"} Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.993234 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" 
event={"ID":"7f2c6e22-6a88-4c63-9da2-e38b813e0f1c","Type":"ContainerStarted","Data":"e2b84720a1b200524901f41ca64fd70986ddf0fdd064e6d873edb1c351855d72"} Mar 21 04:27:02 crc kubenswrapper[4839]: I0321 04:27:02.993759 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" event={"ID":"7f2c6e22-6a88-4c63-9da2-e38b813e0f1c","Type":"ContainerStarted","Data":"b088cc5d0a87ccf4d21ca157a589c34e462b3620d9b14feb6ce67e36f4b78c7a"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.003079 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" event={"ID":"6240548e-b827-4fdb-b2be-c7187d6a28e8","Type":"ContainerStarted","Data":"10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.005134 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.008046 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-brnnr" event={"ID":"d1439545-f492-4e4c-858c-ec85c5c2a9d9","Type":"ContainerStarted","Data":"187e50ca092521b2aa02a610aa2b4cadf43ce1a33f7adfcc7192fe95d3f50787"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.014531 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" event={"ID":"25af4e9d-c029-4ee7-9952-18a3a5e3c333","Type":"ContainerStarted","Data":"9143e2ccac432e733a084565d2fa1c38d821357e0d475219c7271e385d278800"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.014609 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" 
event={"ID":"25af4e9d-c029-4ee7-9952-18a3a5e3c333","Type":"ContainerStarted","Data":"eaba76d3d3d621142d7292023ad530d32d9757f6e702f3b0e5deb428624e31ae"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.023619 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" podStartSLOduration=162.023598636 podStartE2EDuration="2m42.023598636s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.020424121 +0000 UTC m=+227.348210797" watchObservedRunningTime="2026-03-21 04:27:03.023598636 +0000 UTC m=+227.351385312" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.025138 4839 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8jgh7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.025185 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" podUID="6240548e-b827-4fdb-b2be-c7187d6a28e8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.025973 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" event={"ID":"609ace61-45d1-44f6-b378-fb97eecf2374","Type":"ContainerStarted","Data":"3881096e968c291ccdd0e957e85d1c17697b418b86707f4eba8dd532d8654b50"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.041548 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tz8sp" podStartSLOduration=163.041532702 podStartE2EDuration="2m43.041532702s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.040372038 +0000 UTC m=+227.368158714" watchObservedRunningTime="2026-03-21 04:27:03.041532702 +0000 UTC m=+227.369319398" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.043694 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5jhkc" event={"ID":"81a76ad1-da33-4b42-9c0a-d0ada077729a","Type":"ContainerStarted","Data":"a9427ef3e75a30fe9f8aa880b7baddc8b52f8630e29a8f3524aaed9baa58f7f8"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.058000 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" event={"ID":"685c3b51-a70f-484e-b7db-f98383f75003","Type":"ContainerStarted","Data":"046845a3c18469b2c4953cb4c3caa7c880744e360ea33c1ec1534a3bfca174e1"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.073311 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.073727 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.573708704 +0000 UTC m=+227.901495380 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.082440 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cstqb" event={"ID":"4fee5524-9cb1-48c7-83b6-10bf3230c783","Type":"ContainerStarted","Data":"91fdc9d2521333990da8d1f6444e8c42711f0aa0359076d3070c0e7a6f7242e2"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.088021 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4sj57" event={"ID":"a178972b-b463-42db-b2c9-dcba9a51c4bc","Type":"ContainerStarted","Data":"133a77624e75b046fd14ceaebf5844fd31bd6a20c4335f0394025db10d528268"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.091807 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" event={"ID":"28599c04-0840-41a0-91dd-c0ed5bcf99fd","Type":"ContainerStarted","Data":"5ce59bfc9a8d76f61b475a748505225dea02075e980d4be95f046cacab374fb3"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.092154 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.098168 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" 
event={"ID":"14ed32fc-196a-4d5e-a8f3-d9fc6ec765b1","Type":"ContainerStarted","Data":"10517d841e330452bbfc351ceb4a23f6516213a6e7439f1777eb972e63ebd915"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.143794 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-4sj57" podStartSLOduration=6.143772369 podStartE2EDuration="6.143772369s" podCreationTimestamp="2026-03-21 04:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.119835484 +0000 UTC m=+227.447622170" watchObservedRunningTime="2026-03-21 04:27:03.143772369 +0000 UTC m=+227.471559045" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.156778 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" event={"ID":"745f7801-7150-4924-b9fb-e8a0aa1e7edb","Type":"ContainerStarted","Data":"ace0517a65be11d7ab5a93fb88093741fe4b523a4638575f37a6b995e5c62697"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.156838 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" event={"ID":"745f7801-7150-4924-b9fb-e8a0aa1e7edb","Type":"ContainerStarted","Data":"f676c0c1b6ca31caace169e795dc89298d1ab5fa78697ed87adc8f350eda2000"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.172187 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fh8k5" podStartSLOduration=163.172165669 podStartE2EDuration="2m43.172165669s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.168156019 +0000 UTC m=+227.495942695" 
watchObservedRunningTime="2026-03-21 04:27:03.172165669 +0000 UTC m=+227.499952345" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.177604 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.183427 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.683409705 +0000 UTC m=+228.011196371 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.200602 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h" podStartSLOduration=162.200588478 podStartE2EDuration="2m42.200588478s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.199922289 +0000 UTC m=+227.527708965" watchObservedRunningTime="2026-03-21 04:27:03.200588478 +0000 UTC m=+227.528375154" Mar 21 04:27:03 crc 
kubenswrapper[4839]: I0321 04:27:03.213926 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" event={"ID":"6dd3a400-6155-44b9-a358-d2cd089db1f6","Type":"ContainerStarted","Data":"17a65b035567eb719a2f53758708b74e68f6b1f5d9b8593826375378484ff4e1"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.213975 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" event={"ID":"6dd3a400-6155-44b9-a358-d2cd089db1f6","Type":"ContainerStarted","Data":"570791c95d0b2b190c1d8da17592465418d253b93a8afbc494ffb34625ed0073"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.231025 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" event={"ID":"8c8a6e75-7e5f-41c8-8312-b9d274284f35","Type":"ContainerStarted","Data":"cd60520aa8e3d25f4578412e6964fff5da176e03ce4c18684a063c91d4747cba"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.249255 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" event={"ID":"ad426123-af7f-45c4-8a6b-bca3c83017be","Type":"ContainerStarted","Data":"1a56073ecc981d7ea1066ad2e2a545424cc1438c17c937db0ed6b70dfc89f735"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.258080 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-shqhf" podStartSLOduration=162.258062397 podStartE2EDuration="2m42.258062397s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.257648805 +0000 UTC m=+227.585435481" watchObservedRunningTime="2026-03-21 04:27:03.258062397 +0000 UTC m=+227.585849073" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.261296 4839 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-djrdt" podStartSLOduration=163.261280603 podStartE2EDuration="2m43.261280603s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.228190334 +0000 UTC m=+227.555977010" watchObservedRunningTime="2026-03-21 04:27:03.261280603 +0000 UTC m=+227.589067279" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.275257 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" event={"ID":"93bc1508-a828-4d23-b078-1d4164d1bc2c","Type":"ContainerStarted","Data":"a2e9dbd6a8e6a22842ccb2c8c268abb030d9130744c130549c78eba74a31e38b"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.278383 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.278758 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.778743605 +0000 UTC m=+228.106530281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.316540 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2s6j7" podStartSLOduration=163.316516015 podStartE2EDuration="2m43.316516015s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.315717141 +0000 UTC m=+227.643503827" watchObservedRunningTime="2026-03-21 04:27:03.316516015 +0000 UTC m=+227.644302691" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.322337 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" event={"ID":"2bceecf8-583d-4e26-9749-f5939280540b","Type":"ContainerStarted","Data":"f454b3f61918548eed258d39d1cc2445884b335cf1cb9cc8888fef8a8986c650"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.350016 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w6dzs" event={"ID":"28ce563b-8e5b-4abe-b71b-02c588bff511","Type":"ContainerStarted","Data":"08587c23a111b9d07306bacd6540f579f28544379d121f7c40b0619835c75da7"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.378303 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" 
event={"ID":"3db892a0-fb40-4e0e-93ee-a8f2876ad8be","Type":"ContainerStarted","Data":"8bc2f6e184e6ff56e260fca04b6c5b3b4a0ad543df5a1345d96246b3e1860851"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.379966 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.380536 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.880518429 +0000 UTC m=+228.208305185 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.389213 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-w6dzs" podStartSLOduration=163.389193548 podStartE2EDuration="2m43.389193548s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.38792966 +0000 UTC m=+227.715716336" watchObservedRunningTime="2026-03-21 04:27:03.389193548 +0000 UTC 
m=+227.716980224" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.392329 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" event={"ID":"40014780-8cb8-47fa-8b2c-c4fb7d04a85c","Type":"ContainerStarted","Data":"620b99a36572573555d9edafa537b030d63de93db8e78e98b8a825c1b5d40fa1"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.392470 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" event={"ID":"40014780-8cb8-47fa-8b2c-c4fb7d04a85c","Type":"ContainerStarted","Data":"2e8fb3a8c94c2c8e1f3885b168e57485f477a668535e95bf79624d6a8b059e8e"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.405695 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" event={"ID":"10dc7791-eebd-49e9-8d9c-63711119e9d7","Type":"ContainerStarted","Data":"20d8ad17fce889c0891718ad487bd407853d271e12febcfcd6d0d77ebb01b23e"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.415619 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" event={"ID":"0368223e-2e01-4681-a7a6-67b77387f8d8","Type":"ContainerStarted","Data":"c99d11bf14a3d22b9bf9f8b3d4a725f2ad066e9650b4eeed3e95098655e3adb9"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.431324 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" event={"ID":"14f49362-2145-40aa-8a7c-e07c70ea910c","Type":"ContainerStarted","Data":"ecf0ef17f4f1af0a2d4aef88de053db8b68324f615935efcde694faed132643f"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.450059 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-fhqz6" podStartSLOduration=163.450042258 podStartE2EDuration="2m43.450042258s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.448612985 +0000 UTC m=+227.776399661" watchObservedRunningTime="2026-03-21 04:27:03.450042258 +0000 UTC m=+227.777828934" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.458886 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hkg98" event={"ID":"f7156267-6917-4c54-ba75-4a91a0772025","Type":"ContainerStarted","Data":"73f7a1853122a3d755ea4ceef3910803fbdfe0ade43faeec6c9d48fc144f223b"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.460006 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.468873 4839 patch_prober.go:28] interesting pod/console-operator-58897d9998-hkg98 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.469197 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hkg98" podUID="f7156267-6917-4c54-ba75-4a91a0772025" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.469343 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" 
event={"ID":"a83789bf-1523-4d5e-892d-6597aed01b7d","Type":"ContainerStarted","Data":"30b49456a6f5e32ba761b3f2f18fba90c9260b2efe733bf0886236747cab63b8"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.469482 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" event={"ID":"a83789bf-1523-4d5e-892d-6597aed01b7d","Type":"ContainerStarted","Data":"438e2b25c5a34df45dc61cbf2b0e52c20429a7a022e8882c559f2d615dc2395e"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.470047 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.482176 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.483252 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:03.98323609 +0000 UTC m=+228.311022766 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.486168 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" podStartSLOduration=163.486105936 podStartE2EDuration="2m43.486105936s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.478100587 +0000 UTC m=+227.805887273" watchObservedRunningTime="2026-03-21 04:27:03.486105936 +0000 UTC m=+227.813892612" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.488645 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" event={"ID":"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8","Type":"ContainerStarted","Data":"3c79577dadd9928c69db7c80eda570ad44512d4d86f82078636cc99a6209ca04"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.495477 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" event={"ID":"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9","Type":"ContainerStarted","Data":"42e839eac1f8215c88389e8183b602d5bf8635c8709ad35ed5bbab8cd3660612"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.495508 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" event={"ID":"34efe2c8-d7a8-47c1-8890-85ebd5ef1eb9","Type":"ContainerStarted","Data":"f687bfd38b7cfa3b591f3fb6b760992a834ed7768fb5ee365a16b165dfb16871"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.496639 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bj929" event={"ID":"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2","Type":"ContainerStarted","Data":"fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.499095 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" event={"ID":"a79fc033-c671-42ff-aa06-78ae64967c92","Type":"ContainerStarted","Data":"d0f9f8438d4a07c654faa0f7a003efabe25d31ca356650fe31683d5c6c32e350"} Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.499485 4839 patch_prober.go:28] interesting pod/downloads-7954f5f757-qp8mz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.499519 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qp8mz" podUID="4d63cdfd-21e7-4a63-960b-363fb131ac08" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.500394 4839 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xldvn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" 
start-of-body= Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.500439 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" podUID="a83789bf-1523-4d5e-892d-6597aed01b7d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.503953 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" podStartSLOduration=163.503935029 podStartE2EDuration="2m43.503935029s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.501056383 +0000 UTC m=+227.828843059" watchObservedRunningTime="2026-03-21 04:27:03.503935029 +0000 UTC m=+227.831721705" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.517135 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.522292 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.531504 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-whlp9" podStartSLOduration=162.531487883 podStartE2EDuration="2m42.531487883s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.529853774 +0000 UTC m=+227.857640450" 
watchObservedRunningTime="2026-03-21 04:27:03.531487883 +0000 UTC m=+227.859274559" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.590956 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.598994 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.098978631 +0000 UTC m=+228.426765307 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.679638 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" podStartSLOduration=162.679621673 podStartE2EDuration="2m42.679621673s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.59055275 +0000 UTC m=+227.918339436" watchObservedRunningTime="2026-03-21 04:27:03.679621673 +0000 UTC m=+228.007408349" Mar 21 04:27:03 crc kubenswrapper[4839]: 
I0321 04:27:03.679977 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4cwc5" podStartSLOduration=163.679970173 podStartE2EDuration="2m43.679970173s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.653002407 +0000 UTC m=+227.980789093" watchObservedRunningTime="2026-03-21 04:27:03.679970173 +0000 UTC m=+228.007756849" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.692160 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.692792 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.192759226 +0000 UTC m=+228.520545902 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.701083 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.706624 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.20660084 +0000 UTC m=+228.534387516 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.718609 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.719039 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.723715 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.762059 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2g8w" podStartSLOduration=162.762037957 podStartE2EDuration="2m42.762037957s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.760630585 +0000 UTC m=+228.088417261" watchObservedRunningTime="2026-03-21 04:27:03.762037957 +0000 UTC m=+228.089824633" Mar 21 
04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.797344 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.814404 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.814806 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.314790845 +0000 UTC m=+228.642577521 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.915171 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-hkg98" podStartSLOduration=163.915151756 podStartE2EDuration="2m43.915151756s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:03.876383287 +0000 UTC m=+228.204169963" watchObservedRunningTime="2026-03-21 
04:27:03.915151756 +0000 UTC m=+228.242938432" Mar 21 04:27:03 crc kubenswrapper[4839]: I0321 04:27:03.916000 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:03 crc kubenswrapper[4839]: E0321 04:27:03.916392 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.416380353 +0000 UTC m=+228.744167029 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.004754 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-bj929" podStartSLOduration=164.004740035 podStartE2EDuration="2m44.004740035s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.003857678 +0000 UTC m=+228.331644354" watchObservedRunningTime="2026-03-21 04:27:04.004740035 +0000 UTC m=+228.332526711" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.017561 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.017866 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.517832916 +0000 UTC m=+228.845619582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.018023 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.018554 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.518545238 +0000 UTC m=+228.846331914 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.118703 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.121262 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.122134 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.122523 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.622507816 +0000 UTC m=+228.950294492 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.225667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.226034 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.726021572 +0000 UTC m=+229.053808248 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.327109 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.327462 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.827447564 +0000 UTC m=+229.155234240 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.428856 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.429240 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:04.929224547 +0000 UTC m=+229.257011223 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.508196 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-c4tgz" event={"ID":"14f49362-2145-40aa-8a7c-e07c70ea910c","Type":"ContainerStarted","Data":"759b32f79065b18d44e62b398dade6c55fefe271033c92b34ee0784f8dfe8cf0"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.516145 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" event={"ID":"8c8a6e75-7e5f-41c8-8312-b9d274284f35","Type":"ContainerStarted","Data":"192f801676a6fb3ac811f65cb782f56c185d6472eefb8dd7157fa7aed825fd3c"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.516190 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" event={"ID":"8c8a6e75-7e5f-41c8-8312-b9d274284f35","Type":"ContainerStarted","Data":"c94f6178f711cc6b9a432b444a6693b667b9601efb8005c0fc9c12c421ee6b88"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.519528 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" event={"ID":"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8","Type":"ContainerStarted","Data":"640abcd252c04a2e697fcd252058fb0c5c2913641a12d4b19aa4820b4d21f3e8"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.519560 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" event={"ID":"9fc77b0b-af5d-42f5-8fd1-69ac8d6616d8","Type":"ContainerStarted","Data":"3ac49c6150d492e24dca32f31e74f974cfd0ef0dbbca65933463a2a80fe52ff4"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.523038 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" event={"ID":"10dc7791-eebd-49e9-8d9c-63711119e9d7","Type":"ContainerStarted","Data":"9088442267d0b0a7c928d54abdc6411a5ee0f9d3fd810ea7906b30fdc2e96a20"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.524304 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" event={"ID":"0368223e-2e01-4681-a7a6-67b77387f8d8","Type":"ContainerStarted","Data":"8dc51ff3af9bc295da39ecd84349288a171e09e13e9355c5592ecc0b1f1951e7"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.525595 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-brnnr" event={"ID":"d1439545-f492-4e4c-858c-ec85c5c2a9d9","Type":"ContainerStarted","Data":"952b273280ea2fa328e0ff549acc309e188c1c302012107463ce46bcfe123548"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.530320 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.530503 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-21 04:27:05.030487815 +0000 UTC m=+229.358274491 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.530617 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.530947 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.030928368 +0000 UTC m=+229.358715044 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.536306 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5jhkc" event={"ID":"81a76ad1-da33-4b42-9c0a-d0ada077729a","Type":"ContainerStarted","Data":"534753d8eb130a1f76926fb1632d8209e66157f6c2e8528e30034183ce1dd5b6"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.536361 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5jhkc" event={"ID":"81a76ad1-da33-4b42-9c0a-d0ada077729a","Type":"ContainerStarted","Data":"c891a419f7328b292b732c2c2e4597a95d912488259d8d7416f2deacf7fb0e9f"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.536583 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.541070 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" event={"ID":"685c3b51-a70f-484e-b7db-f98383f75003","Type":"ContainerStarted","Data":"fa84c17e80a9eedcf415cda88681e7867da38af3733d30ac7215c31641fe5e5b"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.541941 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.543555 4839 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-78xr9 container/olm-operator namespace/openshift-operator-lifecycle-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.543636 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" podUID="685c3b51-a70f-484e-b7db-f98383f75003" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.552729 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-g9tz2" podStartSLOduration=163.55271316 podStartE2EDuration="2m43.55271316s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.549663548 +0000 UTC m=+228.877450224" watchObservedRunningTime="2026-03-21 04:27:04.55271316 +0000 UTC m=+228.880499836" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.557386 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" event={"ID":"2bc52acb-29f0-4f24-a46a-928a529264dc","Type":"ContainerStarted","Data":"fb7cde5cf5ec6466eb7ac005a25f962746239338ef8ea9b8bad7ad12dd56b03c"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.571442 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" event={"ID":"cae2f42f-b7c7-43c7-b397-a8273ea5844b","Type":"ContainerStarted","Data":"b99776b83e097db45e321be0d5a9b804599d310ae32d00cd000d2780e5aa2659"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.574803 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" 
event={"ID":"ad426123-af7f-45c4-8a6b-bca3c83017be","Type":"ContainerStarted","Data":"60df4a3a0ba78465c0ca36153aa7b3cd78179d02a3f567ab66fe42b9f322cd03"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.576466 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" event={"ID":"a810b51a-5b19-4da9-ad80-05f189d821e4","Type":"ContainerStarted","Data":"d3e33b3ec52738f286e320aed0cd65c111af76df711d2597da407333eb03030b"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.578269 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" event={"ID":"2bceecf8-583d-4e26-9749-f5939280540b","Type":"ContainerStarted","Data":"4d3879596d130bc865ecded5b7f57a37807b79fb1f74f6290336cf5643820f26"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.579017 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.579969 4839 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vg8dq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.580001 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" podUID="2bceecf8-583d-4e26-9749-f5939280540b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.589487 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" 
event={"ID":"7f2c6e22-6a88-4c63-9da2-e38b813e0f1c","Type":"ContainerStarted","Data":"6390bd19477356598926f491e42af1c0e0ea94e9190a2096268f962427d67244"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.604104 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" event={"ID":"9d291bc8-87c0-4a9e-b269-52a7801f050b","Type":"ContainerStarted","Data":"230f8693273a2a0a0adeb0f9051bca00e6fc233bce7b1294e58215a7b0da83a8"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.604148 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" event={"ID":"9d291bc8-87c0-4a9e-b269-52a7801f050b","Type":"ContainerStarted","Data":"c92ecc3ea986b94ba6d739b8e9f5cef071893679a7c30802741401c0db211e81"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.614151 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" event={"ID":"b60d6f1b-b109-4fa4-a85d-ebb845b342bd","Type":"ContainerStarted","Data":"fd22ccf9124a36ac68409088a34c8e8b0b1dc177591b8e7ff82797fe49c5947d"} Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.614204 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.620741 4839 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xldvn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.620802 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn" podUID="a83789bf-1523-4d5e-892d-6597aed01b7d" containerName="catalog-operator" 
probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.624848 4839 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8jgh7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.624879 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" podUID="6240548e-b827-4fdb-b2be-c7187d6a28e8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.625015 4839 patch_prober.go:28] interesting pod/console-operator-58897d9998-hkg98 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.625077 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hkg98" podUID="f7156267-6917-4c54-ba75-4a91a0772025" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.632213 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.633631 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.133610119 +0000 UTC m=+229.461396795 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.708122 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-brnnr" podStartSLOduration=7.708103046 podStartE2EDuration="7.708103046s" podCreationTimestamp="2026-03-21 04:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.596808918 +0000 UTC m=+228.924595594" watchObservedRunningTime="2026-03-21 04:27:04.708103046 +0000 UTC m=+229.035889722" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.730399 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5jhkc" podStartSLOduration=7.730372292 podStartE2EDuration="7.730372292s" podCreationTimestamp="2026-03-21 04:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.679267394 +0000 UTC m=+229.007054090" watchObservedRunningTime="2026-03-21 04:27:04.730372292 
+0000 UTC m=+229.058158968" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.737159 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.740988 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:04 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:04 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:04 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.741062 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.743163 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.243144474 +0000 UTC m=+229.570931150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.748074 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" podStartSLOduration=163.748051031 podStartE2EDuration="2m43.748051031s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.706518019 +0000 UTC m=+229.034304695" watchObservedRunningTime="2026-03-21 04:27:04.748051031 +0000 UTC m=+229.075837707" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.797497 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9m6tl" podStartSLOduration=164.797469079 podStartE2EDuration="2m44.797469079s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.741496065 +0000 UTC m=+229.069282741" watchObservedRunningTime="2026-03-21 04:27:04.797469079 +0000 UTC m=+229.125255755" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.838952 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.839344 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.33932825 +0000 UTC m=+229.667114926 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.863222 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.866370 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.876013 4839 patch_prober.go:28] interesting pod/apiserver-76f77b778f-gl7rc container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.876080 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" podUID="9d291bc8-87c0-4a9e-b269-52a7801f050b" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 21 
04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.923014 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" podStartSLOduration=164.922999982 podStartE2EDuration="2m44.922999982s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.922197998 +0000 UTC m=+229.249984674" watchObservedRunningTime="2026-03-21 04:27:04.922999982 +0000 UTC m=+229.250786658"
Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.925281 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hmqv7" podStartSLOduration=163.92525486 podStartE2EDuration="2m43.92525486s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.86106656 +0000 UTC m=+229.188853236" watchObservedRunningTime="2026-03-21 04:27:04.92525486 +0000 UTC m=+229.253041536"
Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.940482 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:04 crc kubenswrapper[4839]: E0321 04:27:04.940861 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.440848946 +0000 UTC m=+229.768635622 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:04 crc kubenswrapper[4839]: I0321 04:27:04.995523 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6rrrs" podStartSLOduration=163.99550188 podStartE2EDuration="2m43.99550188s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:04.995248543 +0000 UTC m=+229.323035229" watchObservedRunningTime="2026-03-21 04:27:04.99550188 +0000 UTC m=+229.323288556"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.042551 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.043013 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.542995391 +0000 UTC m=+229.870782067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.101665 4839 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-85pc8 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]log ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]etcd ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [-]poststarthook/generic-apiserver-start-informers failed: reason withheld
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]poststarthook/max-in-flight-filter ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-StartUserInformer ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-StartOAuthInformer ok
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok
Mar 21 04:27:05 crc kubenswrapper[4839]: livez check failed
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.101722 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" podUID="67adff78-dfe5-440a-80b0-fefd703c3aa7" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.144941 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.145405 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.645392252 +0000 UTC m=+229.973178918 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.170948 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" podStartSLOduration=164.170930356 podStartE2EDuration="2m44.170930356s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:05.118587921 +0000 UTC m=+229.446374597" watchObservedRunningTime="2026-03-21 04:27:05.170930356 +0000 UTC m=+229.498717032"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.245719 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.245907 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.745881457 +0000 UTC m=+230.073668133 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.246149 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.246474 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.746462055 +0000 UTC m=+230.074248731 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.257126 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-lqm8j" podStartSLOduration=164.257105093 podStartE2EDuration="2m44.257105093s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:05.174325298 +0000 UTC m=+229.502111974" watchObservedRunningTime="2026-03-21 04:27:05.257105093 +0000 UTC m=+229.584891769"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.327143 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" podStartSLOduration=165.327124937 podStartE2EDuration="2m45.327124937s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:05.323391585 +0000 UTC m=+229.651178261" watchObservedRunningTime="2026-03-21 04:27:05.327124937 +0000 UTC m=+229.654911613"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.328371 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hlk25" podStartSLOduration=164.328365274 podStartE2EDuration="2m44.328365274s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:05.258507405 +0000 UTC m=+229.586294081" watchObservedRunningTime="2026-03-21 04:27:05.328365274 +0000 UTC m=+229.656151950"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.347945 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.348294 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.848277379 +0000 UTC m=+230.176064055 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.427052 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-g2rrh" podStartSLOduration=165.427035844 podStartE2EDuration="2m45.427035844s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:05.426947302 +0000 UTC m=+229.754733978" watchObservedRunningTime="2026-03-21 04:27:05.427035844 +0000 UTC m=+229.754822520"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.427118 4839 ???:1] "http: TLS handshake error from 192.168.126.11:53464: no serving certificate available for the kubelet"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.449549 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.449887 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:05.949870837 +0000 UTC m=+230.277657503 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.495173 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-blcpt" podStartSLOduration=164.495149211 podStartE2EDuration="2m44.495149211s" podCreationTimestamp="2026-03-21 04:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:05.493629966 +0000 UTC m=+229.821416642" watchObservedRunningTime="2026-03-21 04:27:05.495149211 +0000 UTC m=+229.822935887"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.551235 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.551977 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.05195508 +0000 UTC m=+230.379741756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.564134 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43516: no serving certificate available for the kubelet"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.627481 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cstqb" event={"ID":"4fee5524-9cb1-48c7-83b6-10bf3230c783","Type":"ContainerStarted","Data":"b345b9124aa14eedcd6fa38210b1af5914af54e5d8dca691b755df6154040f68"}
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.629705 4839 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-78xr9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body=
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.629754 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" podUID="685c3b51-a70f-484e-b7db-f98383f75003" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.640700 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.640818 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xldvn"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.653804 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.656706 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.156691832 +0000 UTC m=+230.484478588 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.663926 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43532: no serving certificate available for the kubelet"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.741830 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:27:05 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld
Mar 21 04:27:05 crc kubenswrapper[4839]: [+]process-running ok
Mar 21 04:27:05 crc kubenswrapper[4839]: healthz check failed
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.741898 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.759941 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.760273 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.260257529 +0000 UTC m=+230.588044205 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.803031 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43534: no serving certificate available for the kubelet"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.861325 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.861782 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.361765884 +0000 UTC m=+230.689552560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.905993 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43538: no serving certificate available for the kubelet"
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.962360 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:05 crc kubenswrapper[4839]: E0321 04:27:05.962909 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.462889138 +0000 UTC m=+230.790675824 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:05 crc kubenswrapper[4839]: I0321 04:27:05.997867 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43544: no serving certificate available for the kubelet"
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.064197 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.064480 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.564468066 +0000 UTC m=+230.892254742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.108147 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43554: no serving certificate available for the kubelet"
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.165107 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.165297 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.6652677 +0000 UTC m=+230.993054376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.165981 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.166345 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.666330172 +0000 UTC m=+230.994116848 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.207314 4839 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-p4nnp container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.207369 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" podUID="b60d6f1b-b109-4fa4-a85d-ebb845b342bd" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.207322 4839 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-p4nnp container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.207579 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" podUID="b60d6f1b-b109-4fa4-a85d-ebb845b342bd" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.251529 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43558: no serving certificate available for the kubelet"
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.267599 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.267733 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.767714683 +0000 UTC m=+231.095501359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.267781 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.268055 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.768047883 +0000 UTC m=+231.095834559 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.369248 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.369444 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.869416854 +0000 UTC m=+231.197203530 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.369495 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.369844 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.869835227 +0000 UTC m=+231.197621903 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.475556 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.475675 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.975652611 +0000 UTC m=+231.303439297 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.475905 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.476331 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:06.976320611 +0000 UTC m=+231.304107287 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.577423 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.577719 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.077702423 +0000 UTC m=+231.405489099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.631177 4839 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vg8dq container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.631232 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" podUID="2bceecf8-583d-4e26-9749-f5939280540b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.632609 4839 patch_prober.go:28] interesting pod/console-operator-58897d9998-hkg98 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.632646 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hkg98" podUID="f7156267-6917-4c54-ba75-4a91a0772025" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.660006 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-78xr9" Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.680604 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.683228 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.183209848 +0000 UTC m=+231.510996594 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.731793 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:06 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:06 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:06 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.731840 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.782075 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.782405 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-21 04:27:07.282389753 +0000 UTC m=+231.610176429 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.885302 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.885659 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.385646481 +0000 UTC m=+231.713433157 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.958978 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43566: no serving certificate available for the kubelet" Mar 21 04:27:06 crc kubenswrapper[4839]: I0321 04:27:06.985872 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:06 crc kubenswrapper[4839]: E0321 04:27:06.986390 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.486359382 +0000 UTC m=+231.814146068 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.088412 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.088829 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.588812766 +0000 UTC m=+231.916599442 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.186505 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nw7r6"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.187678 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.188998 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.189306 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.689293041 +0000 UTC m=+232.017079717 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.192173 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.229347 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nw7r6"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.290275 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-catalog-content\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.290367 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.290407 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krww4\" (UniqueName: \"kubernetes.io/projected/65a571df-f531-458b-9aed-6de99e4607e1-kube-api-access-krww4\") pod 
\"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.290464 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-utilities\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.290802 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.790786785 +0000 UTC m=+232.118573461 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.366400 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mxrc8"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.367946 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.374473 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.389962 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxrc8"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.391499 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.391862 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-catalog-content\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.391939 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krww4\" (UniqueName: \"kubernetes.io/projected/65a571df-f531-458b-9aed-6de99e4607e1-kube-api-access-krww4\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.392027 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-utilities\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " 
pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.392473 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-utilities\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.392558 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:07.892540318 +0000 UTC m=+232.220326994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.392846 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-catalog-content\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.401408 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-45jfn"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.401685 4839 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" podUID="3498feaf-72d5-471a-b25e-fb4b68875767" containerName="controller-manager" containerID="cri-o://3be930827d4d25e493d6f8a65c67c37e9fd4d484ec4f005269d96061246be637" gracePeriod=30 Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.435048 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krww4\" (UniqueName: \"kubernetes.io/projected/65a571df-f531-458b-9aed-6de99e4607e1-kube-api-access-krww4\") pod \"certified-operators-nw7r6\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.458367 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.458631 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" podUID="e81e2384-94b0-4639-bb2d-e4152385c932" containerName="route-controller-manager" containerID="cri-o://e0cf19f30a06660139a2f20b41696b02c7e76a7a7208cb22acb223aa74d717d6" gracePeriod=30 Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.493922 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.494275 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-21 04:27:07.99426183 +0000 UTC m=+232.322048496 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.494521 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-utilities\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.494552 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt4t7\" (UniqueName: \"kubernetes.io/projected/6513c45b-dd98-40b0-b69c-94db4d1c916e-kube-api-access-nt4t7\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.494758 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-catalog-content\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.568402 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qrqj2"] Mar 21 
04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.569427 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.576506 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.591263 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrqj2"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.595606 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.595914 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-catalog-content\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.596032 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-utilities\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.596064 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt4t7\" (UniqueName: 
\"kubernetes.io/projected/6513c45b-dd98-40b0-b69c-94db4d1c916e-kube-api-access-nt4t7\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.596386 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.096365203 +0000 UTC m=+232.424151879 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.596840 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-catalog-content\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.596872 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-utilities\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.642117 4839 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vg8dq container/packageserver 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.642172 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" podUID="2bceecf8-583d-4e26-9749-f5939280540b" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.663286 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt4t7\" (UniqueName: \"kubernetes.io/projected/6513c45b-dd98-40b0-b69c-94db4d1c916e-kube-api-access-nt4t7\") pod \"community-operators-mxrc8\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.683352 4839 generic.go:334] "Generic (PLEG): container finished" podID="3498feaf-72d5-471a-b25e-fb4b68875767" containerID="3be930827d4d25e493d6f8a65c67c37e9fd4d484ec4f005269d96061246be637" exitCode=0 Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.683451 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" event={"ID":"3498feaf-72d5-471a-b25e-fb4b68875767","Type":"ContainerDied","Data":"3be930827d4d25e493d6f8a65c67c37e9fd4d484ec4f005269d96061246be637"} Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.693960 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.697100 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-utilities\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.697192 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-catalog-content\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.697224 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.697318 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7dr2\" (UniqueName: \"kubernetes.io/projected/f1ec80e5-557b-4c30-8323-87d6b1447a6d-kube-api-access-z7dr2\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.697733 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.197717984 +0000 UTC m=+232.525504660 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.709858 4839 generic.go:334] "Generic (PLEG): container finished" podID="e81e2384-94b0-4639-bb2d-e4152385c932" containerID="e0cf19f30a06660139a2f20b41696b02c7e76a7a7208cb22acb223aa74d717d6" exitCode=0 Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.709941 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" event={"ID":"e81e2384-94b0-4639-bb2d-e4152385c932","Type":"ContainerDied","Data":"e0cf19f30a06660139a2f20b41696b02c7e76a7a7208cb22acb223aa74d717d6"} Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.714178 4839 generic.go:334] "Generic (PLEG): container finished" podID="0368223e-2e01-4681-a7a6-67b77387f8d8" containerID="8dc51ff3af9bc295da39ecd84349288a171e09e13e9355c5592ecc0b1f1951e7" exitCode=0 Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.715293 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" event={"ID":"0368223e-2e01-4681-a7a6-67b77387f8d8","Type":"ContainerDied","Data":"8dc51ff3af9bc295da39ecd84349288a171e09e13e9355c5592ecc0b1f1951e7"} Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.728829 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:07 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:07 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:07 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.728908 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.772157 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v4btp"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.773377 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.789191 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v4btp"] Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.799388 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.799630 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-utilities\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 
04:27:07.799704 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-catalog-content\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.799748 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7dr2\" (UniqueName: \"kubernetes.io/projected/f1ec80e5-557b-4c30-8323-87d6b1447a6d-kube-api-access-z7dr2\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.800085 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.300070454 +0000 UTC m=+232.627857130 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.801128 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-utilities\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.801940 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-catalog-content\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.826262 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7dr2\" (UniqueName: \"kubernetes.io/projected/f1ec80e5-557b-4c30-8323-87d6b1447a6d-kube-api-access-z7dr2\") pod \"certified-operators-qrqj2\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") " pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.900660 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: 
\"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.900767 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-utilities\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.900828 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9sw7\" (UniqueName: \"kubernetes.io/projected/dc99f39a-8001-466b-acf1-bd106eb2b81d-kube-api-access-w9sw7\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.900897 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-catalog-content\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:07 crc kubenswrapper[4839]: E0321 04:27:07.901041 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.401025512 +0000 UTC m=+232.728812188 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.947932 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:27:07 crc kubenswrapper[4839]: I0321 04:27:07.999888 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.005192 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.005311 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.50529033 +0000 UTC m=+232.833077006 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.005813 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-client-ca\") pod \"3498feaf-72d5-471a-b25e-fb4b68875767\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.005865 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp2sf\" (UniqueName: \"kubernetes.io/projected/3498feaf-72d5-471a-b25e-fb4b68875767-kube-api-access-wp2sf\") pod \"3498feaf-72d5-471a-b25e-fb4b68875767\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.005899 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-proxy-ca-bundles\") pod \"3498feaf-72d5-471a-b25e-fb4b68875767\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.005933 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3498feaf-72d5-471a-b25e-fb4b68875767-serving-cert\") pod \"3498feaf-72d5-471a-b25e-fb4b68875767\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.006025 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-config\") pod \"3498feaf-72d5-471a-b25e-fb4b68875767\" (UID: \"3498feaf-72d5-471a-b25e-fb4b68875767\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.006284 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9sw7\" (UniqueName: \"kubernetes.io/projected/dc99f39a-8001-466b-acf1-bd106eb2b81d-kube-api-access-w9sw7\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.006864 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-client-ca" (OuterVolumeSpecName: "client-ca") pod "3498feaf-72d5-471a-b25e-fb4b68875767" (UID: "3498feaf-72d5-471a-b25e-fb4b68875767"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.007510 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3498feaf-72d5-471a-b25e-fb4b68875767" (UID: "3498feaf-72d5-471a-b25e-fb4b68875767"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.007556 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-config" (OuterVolumeSpecName: "config") pod "3498feaf-72d5-471a-b25e-fb4b68875767" (UID: "3498feaf-72d5-471a-b25e-fb4b68875767"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.007798 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-catalog-content\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.008056 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.008145 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-utilities\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.008273 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.008299 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.008315 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3498feaf-72d5-471a-b25e-fb4b68875767-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.009023 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-utilities\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.009961 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-catalog-content\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.010309 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.510288379 +0000 UTC m=+232.838075205 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.013063 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3498feaf-72d5-471a-b25e-fb4b68875767-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3498feaf-72d5-471a-b25e-fb4b68875767" (UID: "3498feaf-72d5-471a-b25e-fb4b68875767"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.014897 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3498feaf-72d5-471a-b25e-fb4b68875767-kube-api-access-wp2sf" (OuterVolumeSpecName: "kube-api-access-wp2sf") pod "3498feaf-72d5-471a-b25e-fb4b68875767" (UID: "3498feaf-72d5-471a-b25e-fb4b68875767"). InnerVolumeSpecName "kube-api-access-wp2sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.029353 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9sw7\" (UniqueName: \"kubernetes.io/projected/dc99f39a-8001-466b-acf1-bd106eb2b81d-kube-api-access-w9sw7\") pod \"community-operators-v4btp\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.082451 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.115532 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.115903 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.615865456 +0000 UTC m=+232.943652152 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.116093 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.116277 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp2sf\" (UniqueName: 
\"kubernetes.io/projected/3498feaf-72d5-471a-b25e-fb4b68875767-kube-api-access-wp2sf\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.116300 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3498feaf-72d5-471a-b25e-fb4b68875767-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.116692 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.616680301 +0000 UTC m=+232.944466977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.124112 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.158204 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxrc8"] Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.216956 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e81e2384-94b0-4639-bb2d-e4152385c932-serving-cert\") pod \"e81e2384-94b0-4639-bb2d-e4152385c932\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.217148 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.217180 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-config\") pod \"e81e2384-94b0-4639-bb2d-e4152385c932\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.217201 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-client-ca\") pod \"e81e2384-94b0-4639-bb2d-e4152385c932\" (UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.217262 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64b9q\" (UniqueName: \"kubernetes.io/projected/e81e2384-94b0-4639-bb2d-e4152385c932-kube-api-access-64b9q\") pod \"e81e2384-94b0-4639-bb2d-e4152385c932\" 
(UID: \"e81e2384-94b0-4639-bb2d-e4152385c932\") " Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.217898 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-config" (OuterVolumeSpecName: "config") pod "e81e2384-94b0-4639-bb2d-e4152385c932" (UID: "e81e2384-94b0-4639-bb2d-e4152385c932"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.218395 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-client-ca" (OuterVolumeSpecName: "client-ca") pod "e81e2384-94b0-4639-bb2d-e4152385c932" (UID: "e81e2384-94b0-4639-bb2d-e4152385c932"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.218479 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.718454324 +0000 UTC m=+233.046241050 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.222697 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81e2384-94b0-4639-bb2d-e4152385c932-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e81e2384-94b0-4639-bb2d-e4152385c932" (UID: "e81e2384-94b0-4639-bb2d-e4152385c932"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.225943 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81e2384-94b0-4639-bb2d-e4152385c932-kube-api-access-64b9q" (OuterVolumeSpecName: "kube-api-access-64b9q") pod "e81e2384-94b0-4639-bb2d-e4152385c932" (UID: "e81e2384-94b0-4639-bb2d-e4152385c932"). InnerVolumeSpecName "kube-api-access-64b9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.286421 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43574: no serving certificate available for the kubelet" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.313213 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nw7r6"] Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.320547 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.320771 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e81e2384-94b0-4639-bb2d-e4152385c932-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.320795 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.320806 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e81e2384-94b0-4639-bb2d-e4152385c932-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.320819 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64b9q\" (UniqueName: \"kubernetes.io/projected/e81e2384-94b0-4639-bb2d-e4152385c932-kube-api-access-64b9q\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.320961 4839 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.820945209 +0000 UTC m=+233.148731885 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.328424 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrqj2"] Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.422129 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.424119 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:08.922508536 +0000 UTC m=+233.250295212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.448064 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v4btp"] Mar 21 04:27:08 crc kubenswrapper[4839]: W0321 04:27:08.472845 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc99f39a_8001_466b_acf1_bd106eb2b81d.slice/crio-a5776e6c987dacb310eadbac22656829eea56efcfa2c6987693a184baa498a40 WatchSource:0}: Error finding container a5776e6c987dacb310eadbac22656829eea56efcfa2c6987693a184baa498a40: Status 404 returned error can't find the container with id a5776e6c987dacb310eadbac22656829eea56efcfa2c6987693a184baa498a40 Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.523644 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.524080 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.024042852 +0000 UTC m=+233.351829528 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.624658 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.624789 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.124757624 +0000 UTC m=+233.452544300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.625110 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.625529 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.125511646 +0000 UTC m=+233.453298322 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.720743 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:08 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:08 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:08 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.720805 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.726075 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.726256 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-21 04:27:09.226234998 +0000 UTC m=+233.554021694 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.726375 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.726720 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.226709483 +0000 UTC m=+233.554496159 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.730206 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" event={"ID":"3498feaf-72d5-471a-b25e-fb4b68875767","Type":"ContainerDied","Data":"37e1b23e6295895c07381fb6ccbe7c11a0a79e23312f6edb20749b2a0cf5c684"} Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.730252 4839 scope.go:117] "RemoveContainer" containerID="3be930827d4d25e493d6f8a65c67c37e9fd4d484ec4f005269d96061246be637" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.730428 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-45jfn" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.738097 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqj2" event={"ID":"f1ec80e5-557b-4c30-8323-87d6b1447a6d","Type":"ContainerStarted","Data":"45c4f382e92761207baa8e2c4160a24c616ae580e9d48096eb767d7eab157d90"} Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.740271 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.740861 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz" event={"ID":"e81e2384-94b0-4639-bb2d-e4152385c932","Type":"ContainerDied","Data":"14f69a0375e6c5d03a334a0b16024be0f89d91b54fd01559c41e7673087b6d53"} Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.743870 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrc8" event={"ID":"6513c45b-dd98-40b0-b69c-94db4d1c916e","Type":"ContainerStarted","Data":"3c535ea31a5aa838095ae16f33b0780c50c3c3698c73e47ae8e3f30c17a3ac39"} Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.751533 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw7r6" event={"ID":"65a571df-f531-458b-9aed-6de99e4607e1","Type":"ContainerStarted","Data":"b5f435157e1b2e816a83545c0d59dbf17d3143a0eb363bf4e4b546731c0c8b35"} Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.758852 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4btp" event={"ID":"dc99f39a-8001-466b-acf1-bd106eb2b81d","Type":"ContainerStarted","Data":"a5776e6c987dacb310eadbac22656829eea56efcfa2c6987693a184baa498a40"} Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.777633 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-45jfn"] Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.788084 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-45jfn"] Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.810644 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"] Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.811220 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-76ctz"] Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.827158 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.827357 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.327326031 +0000 UTC m=+233.655112707 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.827504 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.827862 4839 scope.go:117] "RemoveContainer" containerID="e0cf19f30a06660139a2f20b41696b02c7e76a7a7208cb22acb223aa74d717d6" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.828056 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.328046483 +0000 UTC m=+233.655833159 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.929033 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.929245 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.429216108 +0000 UTC m=+233.757002784 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:08 crc kubenswrapper[4839]: I0321 04:27:08.929387 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:08 crc kubenswrapper[4839]: E0321 04:27:08.929758 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.429750304 +0000 UTC m=+233.757537050 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.030366 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.031090 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.531071814 +0000 UTC m=+233.858858490 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.101389 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.127459 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.132169 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.132682 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.632667862 +0000 UTC m=+233.960454538 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.133311 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85pc8" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.219639 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p4nnp" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.232908 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9wch\" (UniqueName: \"kubernetes.io/projected/0368223e-2e01-4681-a7a6-67b77387f8d8-kube-api-access-l9wch\") pod \"0368223e-2e01-4681-a7a6-67b77387f8d8\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.232963 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0368223e-2e01-4681-a7a6-67b77387f8d8-config-volume\") pod \"0368223e-2e01-4681-a7a6-67b77387f8d8\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.233088 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:09 crc 
kubenswrapper[4839]: I0321 04:27:09.233130 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0368223e-2e01-4681-a7a6-67b77387f8d8-secret-volume\") pod \"0368223e-2e01-4681-a7a6-67b77387f8d8\" (UID: \"0368223e-2e01-4681-a7a6-67b77387f8d8\") " Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.235628 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0368223e-2e01-4681-a7a6-67b77387f8d8-config-volume" (OuterVolumeSpecName: "config-volume") pod "0368223e-2e01-4681-a7a6-67b77387f8d8" (UID: "0368223e-2e01-4681-a7a6-67b77387f8d8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.235699 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.735685532 +0000 UTC m=+234.063472198 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.240762 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0368223e-2e01-4681-a7a6-67b77387f8d8-kube-api-access-l9wch" (OuterVolumeSpecName: "kube-api-access-l9wch") pod "0368223e-2e01-4681-a7a6-67b77387f8d8" (UID: "0368223e-2e01-4681-a7a6-67b77387f8d8"). 
InnerVolumeSpecName "kube-api-access-l9wch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.242223 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0368223e-2e01-4681-a7a6-67b77387f8d8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0368223e-2e01-4681-a7a6-67b77387f8d8" (UID: "0368223e-2e01-4681-a7a6-67b77387f8d8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.289488 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.290002 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3498feaf-72d5-471a-b25e-fb4b68875767" containerName="controller-manager" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.290022 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3498feaf-72d5-471a-b25e-fb4b68875767" containerName="controller-manager" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.290033 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81e2384-94b0-4639-bb2d-e4152385c932" containerName="route-controller-manager" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.290039 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81e2384-94b0-4639-bb2d-e4152385c932" containerName="route-controller-manager" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.290054 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0368223e-2e01-4681-a7a6-67b77387f8d8" containerName="collect-profiles" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.290286 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="0368223e-2e01-4681-a7a6-67b77387f8d8" containerName="collect-profiles" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.290530 4839 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0368223e-2e01-4681-a7a6-67b77387f8d8" containerName="collect-profiles" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.290546 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81e2384-94b0-4639-bb2d-e4152385c932" containerName="route-controller-manager" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.290581 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3498feaf-72d5-471a-b25e-fb4b68875767" containerName="controller-manager" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.291181 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.296132 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.297005 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.299103 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.335472 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.335804 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-21 04:27:09.835784326 +0000 UTC m=+234.163571002 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.338705 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e964573d-8ca6-4f88-8754-f34b3aa57504-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e964573d-8ca6-4f88-8754-f34b3aa57504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.338740 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e964573d-8ca6-4f88-8754-f34b3aa57504-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e964573d-8ca6-4f88-8754-f34b3aa57504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.338932 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9wch\" (UniqueName: \"kubernetes.io/projected/0368223e-2e01-4681-a7a6-67b77387f8d8-kube-api-access-l9wch\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.338961 4839 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0368223e-2e01-4681-a7a6-67b77387f8d8-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.338972 4839 
reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0368223e-2e01-4681-a7a6-67b77387f8d8-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.362811 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9qjgq"] Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.363752 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.366749 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.372156 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qjgq"] Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440087 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.440272 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.940248869 +0000 UTC m=+234.268035545 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440331 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-catalog-content\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440385 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440413 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e964573d-8ca6-4f88-8754-f34b3aa57504-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e964573d-8ca6-4f88-8754-f34b3aa57504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440439 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e964573d-8ca6-4f88-8754-f34b3aa57504-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: 
\"e964573d-8ca6-4f88-8754-f34b3aa57504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440462 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncnmc\" (UniqueName: \"kubernetes.io/projected/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-kube-api-access-ncnmc\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440483 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-utilities\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.440815 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:09.940803106 +0000 UTC m=+234.268589782 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.440854 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e964573d-8ca6-4f88-8754-f34b3aa57504-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e964573d-8ca6-4f88-8754-f34b3aa57504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.461018 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.461656 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.464681 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.465904 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e964573d-8ca6-4f88-8754-f34b3aa57504-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e964573d-8ca6-4f88-8754-f34b3aa57504\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.466162 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.470325 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.541983 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.542226 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.542294 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ncnmc\" (UniqueName: \"kubernetes.io/projected/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-kube-api-access-ncnmc\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.542321 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-utilities\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.542357 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.542390 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-catalog-content\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.542752 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:10.042724064 +0000 UTC m=+234.370510740 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.542859 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-catalog-content\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.543115 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-utilities\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.562357 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncnmc\" (UniqueName: \"kubernetes.io/projected/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-kube-api-access-ncnmc\") pod \"redhat-marketplace-9qjgq\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.632662 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.643887 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.643932 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.644031 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.644122 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.644347 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-21 04:27:10.144329472 +0000 UTC m=+234.472116148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.671443 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.683172 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.689021 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"] Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.689874 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.700143 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.700321 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.700489 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.700619 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.700733 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.700845 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.704346 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"] Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.705321 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.707553 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"] Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.715913 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"] Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.722713 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:09 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:09 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:09 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.722794 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.731434 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.731993 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.732126 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.732341 4839 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.739474 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.739855 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.742847 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.744942 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.745328 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-client-ca\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.745458 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e24bacec-594f-429f-8e02-73abc6c4b092-serving-cert\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.745585 4839 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:10.245532728 +0000 UTC m=+234.573319564 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.745656 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5hv7\" (UniqueName: \"kubernetes.io/projected/e24bacec-594f-429f-8e02-73abc6c4b092-kube-api-access-s5hv7\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.745823 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-config\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.745930 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-client-ca\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: 
\"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.746102 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhfng\" (UniqueName: \"kubernetes.io/projected/8805db9c-11be-498e-9f1f-7bc6914dba76-kube-api-access-vhfng\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.746131 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-config\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.746203 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-proxy-ca-bundles\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.746222 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8805db9c-11be-498e-9f1f-7bc6914dba76-serving-cert\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.746316 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.746818 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:10.246789676 +0000 UTC m=+234.574576562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.781982 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7m29z"] Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.804354 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.805280 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.805402 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqj2" event={"ID":"f1ec80e5-557b-4c30-8323-87d6b1447a6d","Type":"ContainerDied","Data":"5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250"} Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.805278 4839 generic.go:334] "Generic (PLEG): container finished" podID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerID="5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250" exitCode=0 Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.807916 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7m29z"] Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.814041 4839 generic.go:334] "Generic (PLEG): container finished" podID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerID="ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127" exitCode=0 Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.814349 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrc8" event={"ID":"6513c45b-dd98-40b0-b69c-94db4d1c916e","Type":"ContainerDied","Data":"ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127"} Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.817675 4839 generic.go:334] "Generic (PLEG): container finished" podID="65a571df-f531-458b-9aed-6de99e4607e1" containerID="b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6" exitCode=0 Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.817739 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw7r6" event={"ID":"65a571df-f531-458b-9aed-6de99e4607e1","Type":"ContainerDied","Data":"b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6"} Mar 21 04:27:09 
crc kubenswrapper[4839]: I0321 04:27:09.824311 4839 generic.go:334] "Generic (PLEG): container finished" podID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerID="87e7256c1b35efeb4f01906aa88cf63b70ae781a00455690c43c1caf1c568dc7" exitCode=0 Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.824373 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4btp" event={"ID":"dc99f39a-8001-466b-acf1-bd106eb2b81d","Type":"ContainerDied","Data":"87e7256c1b35efeb4f01906aa88cf63b70ae781a00455690c43c1caf1c568dc7"} Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.844684 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" event={"ID":"0368223e-2e01-4681-a7a6-67b77387f8d8","Type":"ContainerDied","Data":"c99d11bf14a3d22b9bf9f8b3d4a725f2ad066e9650b4eeed3e95098655e3adb9"} Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.844728 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c99d11bf14a3d22b9bf9f8b3d4a725f2ad066e9650b4eeed3e95098655e3adb9" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.844800 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.848675 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:10.348658172 +0000 UTC m=+234.676444848 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.848715 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.848898 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-catalog-content\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.849021 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5hv7\" (UniqueName: \"kubernetes.io/projected/e24bacec-594f-429f-8e02-73abc6c4b092-kube-api-access-s5hv7\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.849701 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57m7l\" (UniqueName: \"kubernetes.io/projected/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-kube-api-access-57m7l\") pod 
\"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.850385 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-config\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.851779 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-config\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.851868 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-client-ca\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.853731 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-utilities\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.853810 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhfng\" (UniqueName: 
\"kubernetes.io/projected/8805db9c-11be-498e-9f1f-7bc6914dba76-kube-api-access-vhfng\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.853838 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-config\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.853894 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-proxy-ca-bundles\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.853913 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8805db9c-11be-498e-9f1f-7bc6914dba76-serving-cert\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.853976 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 
04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.854025 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-client-ca\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.854055 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e24bacec-594f-429f-8e02-73abc6c4b092-serving-cert\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.853558 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-client-ca\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.856896 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:10.356877988 +0000 UTC m=+234.684664664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.857461 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-proxy-ca-bundles\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.863173 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8805db9c-11be-498e-9f1f-7bc6914dba76-serving-cert\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.866038 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cstqb" event={"ID":"4fee5524-9cb1-48c7-83b6-10bf3230c783","Type":"ContainerStarted","Data":"b49869ad25ce4e8eafa49f497261ed5f8cd6333d47345feb9f2037cf9f52cbee"} Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.866897 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-client-ca\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " 
pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.868518 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-config\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.879652 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5hv7\" (UniqueName: \"kubernetes.io/projected/e24bacec-594f-429f-8e02-73abc6c4b092-kube-api-access-s5hv7\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.880459 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e24bacec-594f-429f-8e02-73abc6c4b092-serving-cert\") pod \"controller-manager-5fbc589df6-8mjvg\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") " pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.884927 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhfng\" (UniqueName: \"kubernetes.io/projected/8805db9c-11be-498e-9f1f-7bc6914dba76-kube-api-access-vhfng\") pod \"route-controller-manager-6bbc66d757-nhjsp\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") " pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.889096 4839 patch_prober.go:28] interesting pod/apiserver-76f77b778f-gl7rc container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed 
with statuscode: 500" start-of-body=[+]ping ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]log ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]etcd ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/generic-apiserver-start-informers ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/max-in-flight-filter ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 21 04:27:09 crc kubenswrapper[4839]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/project.openshift.io-projectcache ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-startinformers ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 21 04:27:09 crc kubenswrapper[4839]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 21 04:27:09 crc kubenswrapper[4839]: livez check failed Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.889175 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" podUID="9d291bc8-87c0-4a9e-b269-52a7801f050b" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.932960 4839 patch_prober.go:28] interesting pod/downloads-7954f5f757-qp8mz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 
10.217.0.14:8080: connect: connection refused" start-of-body= Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.932997 4839 patch_prober.go:28] interesting pod/downloads-7954f5f757-qp8mz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.933025 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qp8mz" podUID="4d63cdfd-21e7-4a63-960b-363fb131ac08" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.933032 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qp8mz" podUID="4d63cdfd-21e7-4a63-960b-363fb131ac08" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.943304 4839 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.961077 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.961286 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-catalog-content\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.961337 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57m7l\" (UniqueName: \"kubernetes.io/projected/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-kube-api-access-57m7l\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.961395 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-utilities\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:09 crc kubenswrapper[4839]: E0321 04:27:09.961903 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-21 04:27:10.461889888 +0000 UTC m=+234.789676564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.962631 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-catalog-content\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.964067 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-utilities\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:09 crc kubenswrapper[4839]: I0321 04:27:09.984168 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57m7l\" (UniqueName: \"kubernetes.io/projected/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-kube-api-access-57m7l\") pod \"redhat-marketplace-7m29z\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") " pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.042066 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.055650 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.063346 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:10 crc kubenswrapper[4839]: E0321 04:27:10.063891 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-21 04:27:10.563872707 +0000 UTC m=+234.891659383 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ql2ps" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.069893 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:10 crc kubenswrapper[4839]: W0321 04:27:10.092771 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode964573d_8ca6_4f88_8754_f34b3aa57504.slice/crio-a42e32fd6c837587208cca2bfb2a070aca572819f156271c0601f9badb5fbc1d WatchSource:0}: Error finding container a42e32fd6c837587208cca2bfb2a070aca572819f156271c0601f9badb5fbc1d: Status 404 returned error can't find the container with id a42e32fd6c837587208cca2bfb2a070aca572819f156271c0601f9badb5fbc1d Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.127889 4839 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-21T04:27:09.943338713Z","Handler":null,"Name":""} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.129776 4839 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.129812 4839 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.138470 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.142253 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qjgq"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.164403 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.168776 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 21 04:27:10 crc kubenswrapper[4839]: W0321 04:27:10.183367 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b7a7313_21c4_4909_9ebe_ebe552b29b8c.slice/crio-6a5663fd0eb16a90e793ba0b93994b3affe90036f9e0e38ea8915b0da62b0425 WatchSource:0}: Error finding container 6a5663fd0eb16a90e793ba0b93994b3affe90036f9e0e38ea8915b0da62b0425: Status 404 returned error can't find the container with id 6a5663fd0eb16a90e793ba0b93994b3affe90036f9e0e38ea8915b0da62b0425 Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.220225 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.246470 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.246551 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.249174 4839 patch_prober.go:28] interesting pod/console-f9d7485db-bj929 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.249231 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-bj929" podUID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.270990 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console-operator/console-operator-58897d9998-hkg98" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.272403 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.280315 4839 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.280359 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.363938 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zgfcm"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.365434 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.369886 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.391401 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zgfcm"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.401419 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ql2ps\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.420285 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"] Mar 21 04:27:10 crc kubenswrapper[4839]: W0321 04:27:10.435401 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode24bacec_594f_429f_8e02_73abc6c4b092.slice/crio-788d6540d0d6e5edc7b3a0c3a17787a5f1728b835f1af19ef78772b9e9d77f08 WatchSource:0}: Error finding container 788d6540d0d6e5edc7b3a0c3a17787a5f1728b835f1af19ef78772b9e9d77f08: Status 404 returned error can't find the container with id 788d6540d0d6e5edc7b3a0c3a17787a5f1728b835f1af19ef78772b9e9d77f08 Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.475485 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jctj4\" (UniqueName: \"kubernetes.io/projected/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-kube-api-access-jctj4\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " 
pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.475550 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-catalog-content\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.475610 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-utilities\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.477308 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3498feaf-72d5-471a-b25e-fb4b68875767" path="/var/lib/kubelet/pods/3498feaf-72d5-471a-b25e-fb4b68875767/volumes" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.478273 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.478823 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81e2384-94b0-4639-bb2d-e4152385c932" path="/var/lib/kubelet/pods/e81e2384-94b0-4639-bb2d-e4152385c932/volumes" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.480629 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.542500 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7m29z"] Mar 21 04:27:10 
crc kubenswrapper[4839]: I0321 04:27:10.565345 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.584646 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-catalog-content\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.585612 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-utilities\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.586114 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-utilities\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.586166 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jctj4\" (UniqueName: \"kubernetes.io/projected/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-kube-api-access-jctj4\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.585913 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-catalog-content\") pod 
\"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.609334 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jctj4\" (UniqueName: \"kubernetes.io/projected/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-kube-api-access-jctj4\") pod \"redhat-operators-zgfcm\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.717211 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-w6dzs" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.723346 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.732947 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:10 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:10 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:10 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.733030 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.761796 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vg8dq" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 
04:27:10.768511 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cznml"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.769892 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.795380 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cznml"] Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.890235 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d994r\" (UniqueName: \"kubernetes.io/projected/b144748c-2940-4efe-a486-d2b5c1239b12-kube-api-access-d994r\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.890817 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-utilities\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.890924 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-catalog-content\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.900345 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43584: no serving certificate available for the kubelet" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.943054 4839 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5b496f66-c844-4eed-b91d-7c0b6b796e5e","Type":"ContainerStarted","Data":"c6b72bb73f4a17d5dd38c827060714d11fba907dd60f71c72f7d82237314541f"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.943136 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5b496f66-c844-4eed-b91d-7c0b6b796e5e","Type":"ContainerStarted","Data":"273ef0a7f028ed23bc9f1614a57b25fb15734b341325ff5bacc9681a87caec3a"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.945586 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" event={"ID":"8805db9c-11be-498e-9f1f-7bc6914dba76","Type":"ContainerStarted","Data":"5001466b0a821fd088ffe8d3580f139c85ddf7eda1ca152fed0f401c41c672e0"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.945607 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" event={"ID":"8805db9c-11be-498e-9f1f-7bc6914dba76","Type":"ContainerStarted","Data":"6319a8a5c70f22df1c28604d34ec4101c8e3e996be0a10469bb57b8a38840242"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.946349 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.974249 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" event={"ID":"e24bacec-594f-429f-8e02-73abc6c4b092","Type":"ContainerStarted","Data":"155058dcb792bcca927dba12ed5317e9b707d06ff0167036853cffab66697b72"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.974316 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" event={"ID":"e24bacec-594f-429f-8e02-73abc6c4b092","Type":"ContainerStarted","Data":"788d6540d0d6e5edc7b3a0c3a17787a5f1728b835f1af19ef78772b9e9d77f08"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.974948 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.976591 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.976536899 podStartE2EDuration="1.976536899s" podCreationTimestamp="2026-03-21 04:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:10.974463257 +0000 UTC m=+235.302249943" watchObservedRunningTime="2026-03-21 04:27:10.976536899 +0000 UTC m=+235.304323575" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.988414 4839 generic.go:334] "Generic (PLEG): container finished" podID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerID="7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5" exitCode=0 Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.988525 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m29z" event={"ID":"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50","Type":"ContainerDied","Data":"7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.988562 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m29z" event={"ID":"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50","Type":"ContainerStarted","Data":"ffa5ea5aab95eb6762df26e9e80adb2d4051bdda8b4098100f8d0af4408a8c40"} Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.993812 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-catalog-content\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.993884 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d994r\" (UniqueName: \"kubernetes.io/projected/b144748c-2940-4efe-a486-d2b5c1239b12-kube-api-access-d994r\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.994030 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-utilities\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.994840 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-utilities\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:10 crc kubenswrapper[4839]: I0321 04:27:10.995947 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-catalog-content\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:10.999493 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" podStartSLOduration=3.999471184 podStartE2EDuration="3.999471184s" podCreationTimestamp="2026-03-21 04:27:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:10.993160126 +0000 UTC m=+235.320946802" watchObservedRunningTime="2026-03-21 04:27:10.999471184 +0000 UTC m=+235.327257860" Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:10.999810 4839 generic.go:334] "Generic (PLEG): container finished" podID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerID="1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231" exitCode=0 Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:10.999885 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qjgq" event={"ID":"0b7a7313-21c4-4909-9ebe-ebe552b29b8c","Type":"ContainerDied","Data":"1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231"} Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:10.999915 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qjgq" event={"ID":"0b7a7313-21c4-4909-9ebe-ebe552b29b8c","Type":"ContainerStarted","Data":"6a5663fd0eb16a90e793ba0b93994b3affe90036f9e0e38ea8915b0da62b0425"} Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.021659 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" podStartSLOduration=4.021639187 podStartE2EDuration="4.021639187s" podCreationTimestamp="2026-03-21 04:27:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:11.019204665 +0000 UTC m=+235.346991351" watchObservedRunningTime="2026-03-21 04:27:11.021639187 +0000 UTC m=+235.349425863" Mar 21 04:27:11 crc 
kubenswrapper[4839]: I0321 04:27:11.030028 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d994r\" (UniqueName: \"kubernetes.io/projected/b144748c-2940-4efe-a486-d2b5c1239b12-kube-api-access-d994r\") pod \"redhat-operators-cznml\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") " pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.589381 4839 patch_prober.go:28] interesting pod/controller-manager-5fbc589df6-8mjvg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.589442 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.597949 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.612408 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e964573d-8ca6-4f88-8754-f34b3aa57504","Type":"ContainerStarted","Data":"9354fa601ffe81cec940ff50063028be74bdb3c1b3d994da34672057ed7bc082"} Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.612458 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e964573d-8ca6-4f88-8754-f34b3aa57504","Type":"ContainerStarted","Data":"a42e32fd6c837587208cca2bfb2a070aca572819f156271c0601f9badb5fbc1d"} Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.655086 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zgfcm"] Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.656342 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cstqb" event={"ID":"4fee5524-9cb1-48c7-83b6-10bf3230c783","Type":"ContainerStarted","Data":"42a43bb3aceec4702990fb0046882e1fb903fed10830c4fd8ed164857f556782"} Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.656381 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cstqb" event={"ID":"4fee5524-9cb1-48c7-83b6-10bf3230c783","Type":"ContainerStarted","Data":"a37cf9596f67207128283a14feff2f3773edd83d2130f0b3b042271179a72d24"} Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.652713 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.652687266 podStartE2EDuration="2.652687266s" podCreationTimestamp="2026-03-21 04:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:11.641023838 
+0000 UTC m=+235.968810514" watchObservedRunningTime="2026-03-21 04:27:11.652687266 +0000 UTC m=+235.980473952" Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.690388 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ql2ps"] Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.695056 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-cstqb" podStartSLOduration=14.695032333 podStartE2EDuration="14.695032333s" podCreationTimestamp="2026-03-21 04:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:11.691622691 +0000 UTC m=+236.019409367" watchObservedRunningTime="2026-03-21 04:27:11.695032333 +0000 UTC m=+236.022819009" Mar 21 04:27:11 crc kubenswrapper[4839]: W0321 04:27:11.703219 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d1d0c02_87bf_4c8b_bc1c_d25007fb3c1c.slice/crio-45ca59ee6d68e70db13f642a35e227f7dae46d5a40341a7fcc4d0c33d12ae8bf WatchSource:0}: Error finding container 45ca59ee6d68e70db13f642a35e227f7dae46d5a40341a7fcc4d0c33d12ae8bf: Status 404 returned error can't find the container with id 45ca59ee6d68e70db13f642a35e227f7dae46d5a40341a7fcc4d0c33d12ae8bf Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.720939 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:11 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:11 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:11 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.721026 4839 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:11 crc kubenswrapper[4839]: I0321 04:27:11.858932 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.164358 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cznml"] Mar 21 04:27:12 crc kubenswrapper[4839]: W0321 04:27:12.167950 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb144748c_2940_4efe_a486_d2b5c1239b12.slice/crio-9d0751bfec85855cd6ce730251b6d95d8b4de8c09e14303b2b6a9c1d9c1fd165 WatchSource:0}: Error finding container 9d0751bfec85855cd6ce730251b6d95d8b4de8c09e14303b2b6a9c1d9c1fd165: Status 404 returned error can't find the container with id 9d0751bfec85855cd6ce730251b6d95d8b4de8c09e14303b2b6a9c1d9c1fd165 Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.669202 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerStarted","Data":"976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9"} Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.669258 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerStarted","Data":"45ca59ee6d68e70db13f642a35e227f7dae46d5a40341a7fcc4d0c33d12ae8bf"} Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.671210 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cznml" 
event={"ID":"b144748c-2940-4efe-a486-d2b5c1239b12","Type":"ContainerStarted","Data":"9d0751bfec85855cd6ce730251b6d95d8b4de8c09e14303b2b6a9c1d9c1fd165"} Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.672971 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" event={"ID":"7ef3f28d-e496-434e-a803-3b9a0fa24690","Type":"ContainerStarted","Data":"db289ed2561962adc1edb7c7cc7d0a2aafe884fed424734dbdd27242d856949f"} Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.675274 4839 generic.go:334] "Generic (PLEG): container finished" podID="e964573d-8ca6-4f88-8754-f34b3aa57504" containerID="9354fa601ffe81cec940ff50063028be74bdb3c1b3d994da34672057ed7bc082" exitCode=0 Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.675427 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e964573d-8ca6-4f88-8754-f34b3aa57504","Type":"ContainerDied","Data":"9354fa601ffe81cec940ff50063028be74bdb3c1b3d994da34672057ed7bc082"} Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.680851 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.720465 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:12 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:12 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:12 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:12 crc kubenswrapper[4839]: I0321 04:27:12.720554 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" 
podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.684362 4839 generic.go:334] "Generic (PLEG): container finished" podID="5b496f66-c844-4eed-b91d-7c0b6b796e5e" containerID="c6b72bb73f4a17d5dd38c827060714d11fba907dd60f71c72f7d82237314541f" exitCode=0 Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.684434 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5b496f66-c844-4eed-b91d-7c0b6b796e5e","Type":"ContainerDied","Data":"c6b72bb73f4a17d5dd38c827060714d11fba907dd60f71c72f7d82237314541f"} Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.687865 4839 generic.go:334] "Generic (PLEG): container finished" podID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerID="976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9" exitCode=0 Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.687950 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerDied","Data":"976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9"} Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.689815 4839 generic.go:334] "Generic (PLEG): container finished" podID="b144748c-2940-4efe-a486-d2b5c1239b12" containerID="d34aaed9fe2b229d5a38e871187b560f6a9b3aa6a74029511701966c2b92c3d3" exitCode=0 Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.689873 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cznml" event={"ID":"b144748c-2940-4efe-a486-d2b5c1239b12","Type":"ContainerDied","Data":"d34aaed9fe2b229d5a38e871187b560f6a9b3aa6a74029511701966c2b92c3d3"} Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.693878 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" event={"ID":"7ef3f28d-e496-434e-a803-3b9a0fa24690","Type":"ContainerStarted","Data":"0b216795d8b50fc395a781f55afc6bd2e9902da0332fa52d6ee539b16a4c0446"} Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.719502 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:13 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:13 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:13 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.719585 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:13 crc kubenswrapper[4839]: I0321 04:27:13.969354 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.000545 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e964573d-8ca6-4f88-8754-f34b3aa57504-kubelet-dir\") pod \"e964573d-8ca6-4f88-8754-f34b3aa57504\" (UID: \"e964573d-8ca6-4f88-8754-f34b3aa57504\") " Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.000733 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e964573d-8ca6-4f88-8754-f34b3aa57504-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e964573d-8ca6-4f88-8754-f34b3aa57504" (UID: "e964573d-8ca6-4f88-8754-f34b3aa57504"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.000848 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e964573d-8ca6-4f88-8754-f34b3aa57504-kube-api-access\") pod \"e964573d-8ca6-4f88-8754-f34b3aa57504\" (UID: \"e964573d-8ca6-4f88-8754-f34b3aa57504\") " Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.001103 4839 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e964573d-8ca6-4f88-8754-f34b3aa57504-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.012911 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e964573d-8ca6-4f88-8754-f34b3aa57504-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e964573d-8ca6-4f88-8754-f34b3aa57504" (UID: "e964573d-8ca6-4f88-8754-f34b3aa57504"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.103214 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e964573d-8ca6-4f88-8754-f34b3aa57504-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.115416 4839 ???:1] "http: TLS handshake error from 192.168.126.11:43590: no serving certificate available for the kubelet" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.702106 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e964573d-8ca6-4f88-8754-f34b3aa57504","Type":"ContainerDied","Data":"a42e32fd6c837587208cca2bfb2a070aca572819f156271c0601f9badb5fbc1d"} Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.702160 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a42e32fd6c837587208cca2bfb2a070aca572819f156271c0601f9badb5fbc1d" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.702287 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.702121 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.727207 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:14 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:14 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:14 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.727301 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.727697 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" podStartSLOduration=174.727682947 podStartE2EDuration="2m54.727682947s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:27:14.720907735 +0000 UTC m=+239.048694431" watchObservedRunningTime="2026-03-21 04:27:14.727682947 +0000 UTC m=+239.055469623" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.866019 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.870791 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gl7rc" Mar 21 04:27:14 crc kubenswrapper[4839]: I0321 04:27:14.983775 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.140438 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kube-api-access\") pod \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.140481 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kubelet-dir\") pod \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\" (UID: \"5b496f66-c844-4eed-b91d-7c0b6b796e5e\") " Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.140694 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5b496f66-c844-4eed-b91d-7c0b6b796e5e" (UID: "5b496f66-c844-4eed-b91d-7c0b6b796e5e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.141149 4839 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.146450 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5b496f66-c844-4eed-b91d-7c0b6b796e5e" (UID: "5b496f66-c844-4eed-b91d-7c0b6b796e5e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.242687 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b496f66-c844-4eed-b91d-7c0b6b796e5e-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.710633 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5b496f66-c844-4eed-b91d-7c0b6b796e5e","Type":"ContainerDied","Data":"273ef0a7f028ed23bc9f1614a57b25fb15734b341325ff5bacc9681a87caec3a"} Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.710668 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="273ef0a7f028ed23bc9f1614a57b25fb15734b341325ff5bacc9681a87caec3a" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.710709 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.720471 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:15 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:15 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:15 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.720554 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:15 crc kubenswrapper[4839]: I0321 04:27:15.773794 4839 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5jhkc" Mar 21 04:27:16 crc kubenswrapper[4839]: I0321 04:27:16.132603 4839 ???:1] "http: TLS handshake error from 192.168.126.11:52940: no serving certificate available for the kubelet" Mar 21 04:27:16 crc kubenswrapper[4839]: I0321 04:27:16.723549 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:16 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:16 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:16 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:16 crc kubenswrapper[4839]: I0321 04:27:16.723631 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:17 crc kubenswrapper[4839]: I0321 04:27:17.719705 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:17 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:17 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:17 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:17 crc kubenswrapper[4839]: I0321 04:27:17.720308 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:18 crc kubenswrapper[4839]: I0321 04:27:18.720169 4839 
patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:18 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:18 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:18 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:18 crc kubenswrapper[4839]: I0321 04:27:18.720234 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:19 crc kubenswrapper[4839]: I0321 04:27:19.722482 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 21 04:27:19 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld Mar 21 04:27:19 crc kubenswrapper[4839]: [+]process-running ok Mar 21 04:27:19 crc kubenswrapper[4839]: healthz check failed Mar 21 04:27:19 crc kubenswrapper[4839]: I0321 04:27:19.722541 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:27:19 crc kubenswrapper[4839]: I0321 04:27:19.931978 4839 patch_prober.go:28] interesting pod/downloads-7954f5f757-qp8mz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Mar 21 04:27:19 crc kubenswrapper[4839]: I0321 04:27:19.932045 4839 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qp8mz" podUID="4d63cdfd-21e7-4a63-960b-363fb131ac08" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Mar 21 04:27:19 crc kubenswrapper[4839]: I0321 04:27:19.931979 4839 patch_prober.go:28] interesting pod/downloads-7954f5f757-qp8mz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body=
Mar 21 04:27:19 crc kubenswrapper[4839]: I0321 04:27:19.932213 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qp8mz" podUID="4d63cdfd-21e7-4a63-960b-363fb131ac08" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused"
Mar 21 04:27:20 crc kubenswrapper[4839]: I0321 04:27:20.247124 4839 patch_prober.go:28] interesting pod/console-f9d7485db-bj929 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body=
Mar 21 04:27:20 crc kubenswrapper[4839]: I0321 04:27:20.247184 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-bj929" podUID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused"
Mar 21 04:27:20 crc kubenswrapper[4839]: I0321 04:27:20.720215 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:27:20 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld
Mar 21 04:27:20 crc kubenswrapper[4839]: [+]process-running ok
Mar 21 04:27:20 crc kubenswrapper[4839]: healthz check failed
Mar 21 04:27:20 crc kubenswrapper[4839]: I0321 04:27:20.720283 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:27:21 crc kubenswrapper[4839]: I0321 04:27:21.227996 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:27:21 crc kubenswrapper[4839]: I0321 04:27:21.230115 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 21 04:27:21 crc kubenswrapper[4839]: I0321 04:27:21.243943 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fa13ce27-53f2-4178-8560-251f0bb3f034-metrics-certs\") pod \"network-metrics-daemon-445ww\" (UID: \"fa13ce27-53f2-4178-8560-251f0bb3f034\") " pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:27:21 crc kubenswrapper[4839]: I0321 04:27:21.385325 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 21 04:27:21 crc kubenswrapper[4839]: I0321 04:27:21.393256 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-445ww"
Mar 21 04:27:21 crc kubenswrapper[4839]: I0321 04:27:21.720340 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:27:21 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld
Mar 21 04:27:21 crc kubenswrapper[4839]: [+]process-running ok
Mar 21 04:27:21 crc kubenswrapper[4839]: healthz check failed
Mar 21 04:27:21 crc kubenswrapper[4839]: I0321 04:27:21.720694 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:27:22 crc kubenswrapper[4839]: I0321 04:27:22.720015 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:27:22 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld
Mar 21 04:27:22 crc kubenswrapper[4839]: [+]process-running ok
Mar 21 04:27:22 crc kubenswrapper[4839]: healthz check failed
Mar 21 04:27:22 crc kubenswrapper[4839]: I0321 04:27:22.720115 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:27:23 crc kubenswrapper[4839]: I0321 04:27:23.720537 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:27:23 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld
Mar 21 04:27:23 crc kubenswrapper[4839]: [+]process-running ok
Mar 21 04:27:23 crc kubenswrapper[4839]: healthz check failed
Mar 21 04:27:23 crc kubenswrapper[4839]: I0321 04:27:23.720789 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:27:24 crc kubenswrapper[4839]: I0321 04:27:24.720810 4839 patch_prober.go:28] interesting pod/router-default-5444994796-w6dzs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 21 04:27:24 crc kubenswrapper[4839]: [-]has-synced failed: reason withheld
Mar 21 04:27:24 crc kubenswrapper[4839]: [+]process-running ok
Mar 21 04:27:24 crc kubenswrapper[4839]: healthz check failed
Mar 21 04:27:24 crc kubenswrapper[4839]: I0321 04:27:24.721105 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w6dzs" podUID="28ce563b-8e5b-4abe-b71b-02c588bff511" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 21 04:27:25 crc kubenswrapper[4839]: I0321 04:27:25.720243 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-w6dzs"
Mar 21 04:27:25 crc kubenswrapper[4839]: I0321 04:27:25.722073 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-w6dzs"
Mar 21 04:27:26 crc kubenswrapper[4839]: I0321 04:27:26.682034 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"]
Mar 21 04:27:26 crc kubenswrapper[4839]: I0321 04:27:26.682261 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager" containerID="cri-o://155058dcb792bcca927dba12ed5317e9b707d06ff0167036853cffab66697b72" gracePeriod=30
Mar 21 04:27:26 crc kubenswrapper[4839]: I0321 04:27:26.709922 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"]
Mar 21 04:27:26 crc kubenswrapper[4839]: I0321 04:27:26.710403 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerName="route-controller-manager" containerID="cri-o://5001466b0a821fd088ffe8d3580f139c85ddf7eda1ca152fed0f401c41c672e0" gracePeriod=30
Mar 21 04:27:27 crc kubenswrapper[4839]: I0321 04:27:27.847045 4839 generic.go:334] "Generic (PLEG): container finished" podID="e24bacec-594f-429f-8e02-73abc6c4b092" containerID="155058dcb792bcca927dba12ed5317e9b707d06ff0167036853cffab66697b72" exitCode=0
Mar 21 04:27:27 crc kubenswrapper[4839]: I0321 04:27:27.847133 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" event={"ID":"e24bacec-594f-429f-8e02-73abc6c4b092","Type":"ContainerDied","Data":"155058dcb792bcca927dba12ed5317e9b707d06ff0167036853cffab66697b72"}
Mar 21 04:27:27 crc kubenswrapper[4839]: I0321 04:27:27.849785 4839 generic.go:334] "Generic (PLEG): container finished" podID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerID="5001466b0a821fd088ffe8d3580f139c85ddf7eda1ca152fed0f401c41c672e0" exitCode=0
Mar 21 04:27:27 crc kubenswrapper[4839]: I0321 04:27:27.849810 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" event={"ID":"8805db9c-11be-498e-9f1f-7bc6914dba76","Type":"ContainerDied","Data":"5001466b0a821fd088ffe8d3580f139c85ddf7eda1ca152fed0f401c41c672e0"}
Mar 21 04:27:29 crc kubenswrapper[4839]: I0321 04:27:29.936184 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qp8mz"
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.043799 4839 patch_prober.go:28] interesting pod/controller-manager-5fbc589df6-8mjvg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body=
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.043863 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused"
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.073249 4839 patch_prober.go:28] interesting pod/route-controller-manager-6bbc66d757-nhjsp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body=
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.073308 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused"
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.350446 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-bj929"
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.354839 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-bj929"
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.570523 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps"
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.980723 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 04:27:30 crc kubenswrapper[4839]: I0321 04:27:30.980791 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 04:27:36 crc kubenswrapper[4839]: I0321 04:27:36.642173 4839 ???:1] "http: TLS handshake error from 192.168.126.11:39288: no serving certificate available for the kubelet"
Mar 21 04:27:39 crc kubenswrapper[4839]: E0321 04:27:39.508101 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 21 04:27:39 crc kubenswrapper[4839]: E0321 04:27:39.508535 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nt4t7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-mxrc8_openshift-marketplace(6513c45b-dd98-40b0-b69c-94db4d1c916e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 21 04:27:39 crc kubenswrapper[4839]: E0321 04:27:39.509701 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-mxrc8" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e"
Mar 21 04:27:40 crc kubenswrapper[4839]: I0321 04:27:40.137449 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bxg8h"
Mar 21 04:27:41 crc kubenswrapper[4839]: I0321 04:27:41.043327 4839 patch_prober.go:28] interesting pod/controller-manager-5fbc589df6-8mjvg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:27:41 crc kubenswrapper[4839]: I0321 04:27:41.043415 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:27:41 crc kubenswrapper[4839]: I0321 04:27:41.071108 4839 patch_prober.go:28] interesting pod/route-controller-manager-6bbc66d757-nhjsp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:27:41 crc kubenswrapper[4839]: I0321 04:27:41.071224 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:27:42 crc kubenswrapper[4839]: E0321 04:27:42.086558 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mxrc8" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.681791 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 21 04:27:42 crc kubenswrapper[4839]: E0321 04:27:42.682216 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e964573d-8ca6-4f88-8754-f34b3aa57504" containerName="pruner"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.682259 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e964573d-8ca6-4f88-8754-f34b3aa57504" containerName="pruner"
Mar 21 04:27:42 crc kubenswrapper[4839]: E0321 04:27:42.682297 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b496f66-c844-4eed-b91d-7c0b6b796e5e" containerName="pruner"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.682314 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b496f66-c844-4eed-b91d-7c0b6b796e5e" containerName="pruner"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.682665 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b496f66-c844-4eed-b91d-7c0b6b796e5e" containerName="pruner"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.682714 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e964573d-8ca6-4f88-8754-f34b3aa57504" containerName="pruner"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.683508 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.685964 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.686194 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.691333 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.866947 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.867005 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.968332 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.968380 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:42 crc kubenswrapper[4839]: I0321 04:27:42.968479 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:43 crc kubenswrapper[4839]: I0321 04:27:43.002751 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:43 crc kubenswrapper[4839]: I0321 04:27:43.015422 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.068797 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.071208 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.079237 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.131028 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.131092 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-var-lock\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.131385 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kube-api-access\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.232699 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.232780 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-var-lock\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.232862 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.232912 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kube-api-access\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.232951 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-var-lock\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.250130 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kube-api-access\") pod \"installer-9-crc\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:48 crc kubenswrapper[4839]: I0321 04:27:48.399211 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 21 04:27:51 crc kubenswrapper[4839]: I0321 04:27:51.043993 4839 patch_prober.go:28] interesting pod/controller-manager-5fbc589df6-8mjvg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:27:51 crc kubenswrapper[4839]: I0321 04:27:51.044379 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:27:51 crc kubenswrapper[4839]: I0321 04:27:51.071117 4839 patch_prober.go:28] interesting pod/route-controller-manager-6bbc66d757-nhjsp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 21 04:27:51 crc kubenswrapper[4839]: I0321 04:27:51.071188 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 21 04:27:51 crc kubenswrapper[4839]: E0321 04:27:51.077900 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 21 04:27:51 crc kubenswrapper[4839]: E0321 04:27:51.078058 4839 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 21 04:27:51 crc kubenswrapper[4839]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 21 04:27:51 crc kubenswrapper[4839]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vjkdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29567786-d8w8k_openshift-infra(609ace61-45d1-44f6-b378-fb97eecf2374): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 21 04:27:51 crc kubenswrapper[4839]: > logger="UnhandledError"
Mar 21 04:27:51 crc kubenswrapper[4839]: E0321 04:27:51.080058 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" podUID="609ace61-45d1-44f6-b378-fb97eecf2374"
Mar 21 04:27:51 crc kubenswrapper[4839]: E0321 04:27:51.996142 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" podUID="609ace61-45d1-44f6-b378-fb97eecf2374"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.737062 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.741977 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.771263 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"]
Mar 21 04:27:52 crc kubenswrapper[4839]: E0321 04:27:52.771556 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.771583 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager"
Mar 21 04:27:52 crc kubenswrapper[4839]: E0321 04:27:52.771593 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerName="route-controller-manager"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.771599 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerName="route-controller-manager"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.771725 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" containerName="route-controller-manager"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.771738 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" containerName="controller-manager"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.772182 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.775009 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"]
Mar 21 04:27:52 crc kubenswrapper[4839]: E0321 04:27:52.888029 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 21 04:27:52 crc kubenswrapper[4839]: E0321 04:27:52.888238 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z7dr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qrqj2_openshift-marketplace(f1ec80e5-557b-4c30-8323-87d6b1447a6d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 21 04:27:52 crc kubenswrapper[4839]: E0321 04:27:52.890667 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-qrqj2" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906087 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhfng\" (UniqueName: \"kubernetes.io/projected/8805db9c-11be-498e-9f1f-7bc6914dba76-kube-api-access-vhfng\") pod \"8805db9c-11be-498e-9f1f-7bc6914dba76\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906136 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-config\") pod \"8805db9c-11be-498e-9f1f-7bc6914dba76\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906181 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e24bacec-594f-429f-8e02-73abc6c4b092-serving-cert\") pod \"e24bacec-594f-429f-8e02-73abc6c4b092\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906217 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8805db9c-11be-498e-9f1f-7bc6914dba76-serving-cert\") pod \"8805db9c-11be-498e-9f1f-7bc6914dba76\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906252 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-client-ca\") pod \"e24bacec-594f-429f-8e02-73abc6c4b092\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906281 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5hv7\" (UniqueName: \"kubernetes.io/projected/e24bacec-594f-429f-8e02-73abc6c4b092-kube-api-access-s5hv7\") pod \"e24bacec-594f-429f-8e02-73abc6c4b092\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906315 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-client-ca\") pod \"8805db9c-11be-498e-9f1f-7bc6914dba76\" (UID: \"8805db9c-11be-498e-9f1f-7bc6914dba76\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906357 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-config\") pod \"e24bacec-594f-429f-8e02-73abc6c4b092\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906374 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-proxy-ca-bundles\") pod \"e24bacec-594f-429f-8e02-73abc6c4b092\" (UID: \"e24bacec-594f-429f-8e02-73abc6c4b092\") "
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906593 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-client-ca\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"
Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906639 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-proxy-ca-bundles\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") "
pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906668 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-config\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906700 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s54qc\" (UniqueName: \"kubernetes.io/projected/602cd797-e549-4e26-a152-a1cb4decf82d-kube-api-access-s54qc\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.906722 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/602cd797-e549-4e26-a152-a1cb4decf82d-serving-cert\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.907251 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e24bacec-594f-429f-8e02-73abc6c4b092" (UID: "e24bacec-594f-429f-8e02-73abc6c4b092"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.907289 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-config" (OuterVolumeSpecName: "config") pod "e24bacec-594f-429f-8e02-73abc6c4b092" (UID: "e24bacec-594f-429f-8e02-73abc6c4b092"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.907398 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-client-ca" (OuterVolumeSpecName: "client-ca") pod "e24bacec-594f-429f-8e02-73abc6c4b092" (UID: "e24bacec-594f-429f-8e02-73abc6c4b092"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.907598 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-client-ca" (OuterVolumeSpecName: "client-ca") pod "8805db9c-11be-498e-9f1f-7bc6914dba76" (UID: "8805db9c-11be-498e-9f1f-7bc6914dba76"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.908347 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-config" (OuterVolumeSpecName: "config") pod "8805db9c-11be-498e-9f1f-7bc6914dba76" (UID: "8805db9c-11be-498e-9f1f-7bc6914dba76"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.910732 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24bacec-594f-429f-8e02-73abc6c4b092-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e24bacec-594f-429f-8e02-73abc6c4b092" (UID: "e24bacec-594f-429f-8e02-73abc6c4b092"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.910802 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8805db9c-11be-498e-9f1f-7bc6914dba76-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8805db9c-11be-498e-9f1f-7bc6914dba76" (UID: "8805db9c-11be-498e-9f1f-7bc6914dba76"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.910933 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8805db9c-11be-498e-9f1f-7bc6914dba76-kube-api-access-vhfng" (OuterVolumeSpecName: "kube-api-access-vhfng") pod "8805db9c-11be-498e-9f1f-7bc6914dba76" (UID: "8805db9c-11be-498e-9f1f-7bc6914dba76"). InnerVolumeSpecName "kube-api-access-vhfng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.911882 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e24bacec-594f-429f-8e02-73abc6c4b092-kube-api-access-s5hv7" (OuterVolumeSpecName: "kube-api-access-s5hv7") pod "e24bacec-594f-429f-8e02-73abc6c4b092" (UID: "e24bacec-594f-429f-8e02-73abc6c4b092"). InnerVolumeSpecName "kube-api-access-s5hv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.999774 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" event={"ID":"8805db9c-11be-498e-9f1f-7bc6914dba76","Type":"ContainerDied","Data":"6319a8a5c70f22df1c28604d34ec4101c8e3e996be0a10469bb57b8a38840242"} Mar 21 04:27:52 crc kubenswrapper[4839]: I0321 04:27:52.999826 4839 scope.go:117] "RemoveContainer" containerID="5001466b0a821fd088ffe8d3580f139c85ddf7eda1ca152fed0f401c41c672e0" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:52.999923 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.002734 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" event={"ID":"e24bacec-594f-429f-8e02-73abc6c4b092","Type":"ContainerDied","Data":"788d6540d0d6e5edc7b3a0c3a17787a5f1728b835f1af19ef78772b9e9d77f08"} Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.003659 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fbc589df6-8mjvg" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007697 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s54qc\" (UniqueName: \"kubernetes.io/projected/602cd797-e549-4e26-a152-a1cb4decf82d-kube-api-access-s54qc\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007739 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/602cd797-e549-4e26-a152-a1cb4decf82d-serving-cert\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007838 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-client-ca\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007865 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-proxy-ca-bundles\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007890 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-config\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007937 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhfng\" (UniqueName: \"kubernetes.io/projected/8805db9c-11be-498e-9f1f-7bc6914dba76-kube-api-access-vhfng\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007952 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007963 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e24bacec-594f-429f-8e02-73abc6c4b092-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007975 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8805db9c-11be-498e-9f1f-7bc6914dba76-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007987 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.007997 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5hv7\" (UniqueName: \"kubernetes.io/projected/e24bacec-594f-429f-8e02-73abc6c4b092-kube-api-access-s5hv7\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.008007 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/8805db9c-11be-498e-9f1f-7bc6914dba76-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.008018 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.008027 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e24bacec-594f-429f-8e02-73abc6c4b092-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.009017 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-client-ca\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.009367 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-config\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.009460 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-proxy-ca-bundles\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.012128 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/602cd797-e549-4e26-a152-a1cb4decf82d-serving-cert\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.027026 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s54qc\" (UniqueName: \"kubernetes.io/projected/602cd797-e549-4e26-a152-a1cb4decf82d-kube-api-access-s54qc\") pod \"controller-manager-64bdf84bc9-fblcp\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.050255 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"] Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.057183 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bbc66d757-nhjsp"] Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.060707 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"] Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.064216 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5fbc589df6-8mjvg"] Mar 21 04:27:53 crc kubenswrapper[4839]: I0321 04:27:53.094438 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:27:54 crc kubenswrapper[4839]: I0321 04:27:54.466914 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8805db9c-11be-498e-9f1f-7bc6914dba76" path="/var/lib/kubelet/pods/8805db9c-11be-498e-9f1f-7bc6914dba76/volumes" Mar 21 04:27:54 crc kubenswrapper[4839]: I0321 04:27:54.468168 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e24bacec-594f-429f-8e02-73abc6c4b092" path="/var/lib/kubelet/pods/e24bacec-594f-429f-8e02-73abc6c4b092/volumes" Mar 21 04:27:55 crc kubenswrapper[4839]: E0321 04:27:55.318915 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 21 04:27:55 crc kubenswrapper[4839]: E0321 04:27:55.319071 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w9sw7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-v4btp_openshift-marketplace(dc99f39a-8001-466b-acf1-bd106eb2b81d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:27:55 crc kubenswrapper[4839]: E0321 04:27:55.320230 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-v4btp" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" Mar 21 04:27:55 crc 
kubenswrapper[4839]: I0321 04:27:55.717755 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7"] Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.720781 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.722637 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.723078 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.723132 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.723663 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7"] Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.724905 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.724914 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.725134 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.740374 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-client-ca\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.740410 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4842010-e137-466d-9596-e65f0cf2f4da-serving-cert\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.740448 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-config\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.740558 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9x2w\" (UniqueName: \"kubernetes.io/projected/c4842010-e137-466d-9596-e65f0cf2f4da-kube-api-access-k9x2w\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.841344 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-client-ca\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " 
pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.841400 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4842010-e137-466d-9596-e65f0cf2f4da-serving-cert\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.841449 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-config\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.841552 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9x2w\" (UniqueName: \"kubernetes.io/projected/c4842010-e137-466d-9596-e65f0cf2f4da-kube-api-access-k9x2w\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.842833 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-client-ca\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.843635 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-config\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.855672 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4842010-e137-466d-9596-e65f0cf2f4da-serving-cert\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:55 crc kubenswrapper[4839]: I0321 04:27:55.869248 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9x2w\" (UniqueName: \"kubernetes.io/projected/c4842010-e137-466d-9596-e65f0cf2f4da-kube-api-access-k9x2w\") pod \"route-controller-manager-bbccd6fb-qndh7\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:58 crc kubenswrapper[4839]: I0321 04:27:56.036536 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:27:58 crc kubenswrapper[4839]: E0321 04:27:57.831334 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qrqj2" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" Mar 21 04:27:58 crc kubenswrapper[4839]: E0321 04:27:57.831365 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-v4btp" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.474453 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.474647 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ncnmc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9qjgq_openshift-marketplace(0b7a7313-21c4-4909-9ebe-ebe552b29b8c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.475898 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9qjgq" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" Mar 21 04:27:59 crc 
kubenswrapper[4839]: E0321 04:27:59.696340 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.696934 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-57m7l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-7m29z_openshift-marketplace(c3ae9e7a-784b-4a39-bd4e-10dbff65cd50): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.698106 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7m29z" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.849943 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.850134 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-krww4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-nw7r6_openshift-marketplace(65a571df-f531-458b-9aed-6de99e4607e1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:27:59 crc kubenswrapper[4839]: E0321 04:27:59.851558 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-nw7r6" podUID="65a571df-f531-458b-9aed-6de99e4607e1" Mar 21 04:28:00 crc 
kubenswrapper[4839]: I0321 04:28:00.126882 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567788-9snlp"] Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.128186 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567788-9snlp" Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.130830 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.134331 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567788-9snlp"] Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.190008 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f67r4\" (UniqueName: \"kubernetes.io/projected/a45deb0c-4247-4d23-86db-a897c7f7e7f2-kube-api-access-f67r4\") pod \"auto-csr-approver-29567788-9snlp\" (UID: \"a45deb0c-4247-4d23-86db-a897c7f7e7f2\") " pod="openshift-infra/auto-csr-approver-29567788-9snlp" Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.291558 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f67r4\" (UniqueName: \"kubernetes.io/projected/a45deb0c-4247-4d23-86db-a897c7f7e7f2-kube-api-access-f67r4\") pod \"auto-csr-approver-29567788-9snlp\" (UID: \"a45deb0c-4247-4d23-86db-a897c7f7e7f2\") " pod="openshift-infra/auto-csr-approver-29567788-9snlp" Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.308811 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f67r4\" (UniqueName: \"kubernetes.io/projected/a45deb0c-4247-4d23-86db-a897c7f7e7f2-kube-api-access-f67r4\") pod \"auto-csr-approver-29567788-9snlp\" (UID: \"a45deb0c-4247-4d23-86db-a897c7f7e7f2\") " pod="openshift-infra/auto-csr-approver-29567788-9snlp" Mar 21 04:28:00 crc 
kubenswrapper[4839]: I0321 04:28:00.447451 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567788-9snlp" Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.980984 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:28:00 crc kubenswrapper[4839]: I0321 04:28:00.981263 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.587988 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-7m29z" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.588123 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9qjgq" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.588280 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/certified-operators-nw7r6" podUID="65a571df-f531-458b-9aed-6de99e4607e1" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.615343 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.615634 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jctj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,R
esizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zgfcm_openshift-marketplace(5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.617144 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zgfcm" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" Mar 21 04:28:02 crc kubenswrapper[4839]: I0321 04:28:02.624991 4839 scope.go:117] "RemoveContainer" containerID="155058dcb792bcca927dba12ed5317e9b707d06ff0167036853cffab66697b72" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.626386 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.626546 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d994r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cznml_openshift-marketplace(b144748c-2940-4efe-a486-d2b5c1239b12): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:28:02 crc kubenswrapper[4839]: E0321 04:28:02.628316 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cznml" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" Mar 21 04:28:03 crc 
kubenswrapper[4839]: I0321 04:28:03.076015 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 21 04:28:03 crc kubenswrapper[4839]: I0321 04:28:03.079245 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"] Mar 21 04:28:03 crc kubenswrapper[4839]: I0321 04:28:03.094132 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-445ww"] Mar 21 04:28:03 crc kubenswrapper[4839]: I0321 04:28:03.161952 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567788-9snlp"] Mar 21 04:28:03 crc kubenswrapper[4839]: I0321 04:28:03.174471 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7"] Mar 21 04:28:03 crc kubenswrapper[4839]: I0321 04:28:03.178672 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 21 04:28:03 crc kubenswrapper[4839]: W0321 04:28:03.562764 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd62030f0_7d6f_46f7_83a2_a28fafe1a4ef.slice/crio-068c125d2ab1d568fb725b9241f548239afcbec04a0fa6f6362a280d379a40c3 WatchSource:0}: Error finding container 068c125d2ab1d568fb725b9241f548239afcbec04a0fa6f6362a280d379a40c3: Status 404 returned error can't find the container with id 068c125d2ab1d568fb725b9241f548239afcbec04a0fa6f6362a280d379a40c3 Mar 21 04:28:03 crc kubenswrapper[4839]: W0321 04:28:03.564540 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod602cd797_e549_4e26_a152_a1cb4decf82d.slice/crio-37880a6725bfc4b97b6547a22128a884c967634f4fc94acf8976fa9f68b2ab94 WatchSource:0}: Error finding container 37880a6725bfc4b97b6547a22128a884c967634f4fc94acf8976fa9f68b2ab94: Status 404 returned error 
can't find the container with id 37880a6725bfc4b97b6547a22128a884c967634f4fc94acf8976fa9f68b2ab94 Mar 21 04:28:03 crc kubenswrapper[4839]: W0321 04:28:03.565275 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa13ce27_53f2_4178_8560_251f0bb3f034.slice/crio-4f340f0cf8acc72f2c79da7cce1f15dc42b5eaee90faa48320ff32261b98e874 WatchSource:0}: Error finding container 4f340f0cf8acc72f2c79da7cce1f15dc42b5eaee90faa48320ff32261b98e874: Status 404 returned error can't find the container with id 4f340f0cf8acc72f2c79da7cce1f15dc42b5eaee90faa48320ff32261b98e874 Mar 21 04:28:03 crc kubenswrapper[4839]: W0321 04:28:03.567333 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda45deb0c_4247_4d23_86db_a897c7f7e7f2.slice/crio-7b7b577e4ba61820653dda6ae5bcf120e30feefac0d8c37751d6044474181784 WatchSource:0}: Error finding container 7b7b577e4ba61820653dda6ae5bcf120e30feefac0d8c37751d6044474181784: Status 404 returned error can't find the container with id 7b7b577e4ba61820653dda6ae5bcf120e30feefac0d8c37751d6044474181784 Mar 21 04:28:03 crc kubenswrapper[4839]: W0321 04:28:03.569197 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4842010_e137_466d_9596_e65f0cf2f4da.slice/crio-8c3849bbe60c3ca3e53c5a7956c8fe11b773acab42338cd4566762648f52546e WatchSource:0}: Error finding container 8c3849bbe60c3ca3e53c5a7956c8fe11b773acab42338cd4566762648f52546e: Status 404 returned error can't find the container with id 8c3849bbe60c3ca3e53c5a7956c8fe11b773acab42338cd4566762648f52546e Mar 21 04:28:03 crc kubenswrapper[4839]: E0321 04:28:03.569529 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zgfcm" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" Mar 21 04:28:03 crc kubenswrapper[4839]: W0321 04:28:03.570519 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode3a71e7a_3ead_483f_8de2_9dbf3a336182.slice/crio-c7bf0d3551fb41779d1f34afa1d26b3709109fa26382cddc767b170e4665d502 WatchSource:0}: Error finding container c7bf0d3551fb41779d1f34afa1d26b3709109fa26382cddc767b170e4665d502: Status 404 returned error can't find the container with id c7bf0d3551fb41779d1f34afa1d26b3709109fa26382cddc767b170e4665d502 Mar 21 04:28:04 crc kubenswrapper[4839]: I0321 04:28:04.056302 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" event={"ID":"c4842010-e137-466d-9596-e65f0cf2f4da","Type":"ContainerStarted","Data":"8c3849bbe60c3ca3e53c5a7956c8fe11b773acab42338cd4566762648f52546e"} Mar 21 04:28:04 crc kubenswrapper[4839]: I0321 04:28:04.058151 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-445ww" event={"ID":"fa13ce27-53f2-4178-8560-251f0bb3f034","Type":"ContainerStarted","Data":"4f340f0cf8acc72f2c79da7cce1f15dc42b5eaee90faa48320ff32261b98e874"} Mar 21 04:28:04 crc kubenswrapper[4839]: I0321 04:28:04.059048 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e3a71e7a-3ead-483f-8de2-9dbf3a336182","Type":"ContainerStarted","Data":"c7bf0d3551fb41779d1f34afa1d26b3709109fa26382cddc767b170e4665d502"} Mar 21 04:28:04 crc kubenswrapper[4839]: I0321 04:28:04.059980 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567788-9snlp" event={"ID":"a45deb0c-4247-4d23-86db-a897c7f7e7f2","Type":"ContainerStarted","Data":"7b7b577e4ba61820653dda6ae5bcf120e30feefac0d8c37751d6044474181784"} Mar 21 04:28:04 
crc kubenswrapper[4839]: I0321 04:28:04.061845 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" event={"ID":"602cd797-e549-4e26-a152-a1cb4decf82d","Type":"ContainerStarted","Data":"37880a6725bfc4b97b6547a22128a884c967634f4fc94acf8976fa9f68b2ab94"} Mar 21 04:28:04 crc kubenswrapper[4839]: I0321 04:28:04.063009 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef","Type":"ContainerStarted","Data":"068c125d2ab1d568fb725b9241f548239afcbec04a0fa6f6362a280d379a40c3"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.071374 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e3a71e7a-3ead-483f-8de2-9dbf3a336182","Type":"ContainerStarted","Data":"c850247b91b749f4d993a8c6034f93518caa14ae16f4055edfe77ec5dbf0002f"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.073331 4839 generic.go:334] "Generic (PLEG): container finished" podID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerID="8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730" exitCode=0 Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.073407 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrc8" event={"ID":"6513c45b-dd98-40b0-b69c-94db4d1c916e","Type":"ContainerDied","Data":"8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.075518 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-445ww" event={"ID":"fa13ce27-53f2-4178-8560-251f0bb3f034","Type":"ContainerStarted","Data":"d69c82792ad201cbdc2bc4454792efe9e2d4e64a634f0d00512c8c3edde44b9c"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.075546 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-445ww" event={"ID":"fa13ce27-53f2-4178-8560-251f0bb3f034","Type":"ContainerStarted","Data":"ceafc028fef67080a2b5efff83e1dd8de71061972f2082c19d9e75b580a7dcc7"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.078818 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567788-9snlp" event={"ID":"a45deb0c-4247-4d23-86db-a897c7f7e7f2","Type":"ContainerStarted","Data":"4d013e774070ce075bd0baa030b45d638ec14fab41990f0c671aa0d311846927"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.082481 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" event={"ID":"602cd797-e549-4e26-a152-a1cb4decf82d","Type":"ContainerStarted","Data":"daace327abb9a5740d790334ff3eab34c10f48214de659b79cea981c463ae614"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.082712 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.085982 4839 generic.go:334] "Generic (PLEG): container finished" podID="d62030f0-7d6f-46f7-83a2-a28fafe1a4ef" containerID="e8a9f98548a454114e260e93906bc2ca769f62dc7de5ac4bc5fa88c6f2fff894" exitCode=0 Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.086074 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef","Type":"ContainerDied","Data":"e8a9f98548a454114e260e93906bc2ca769f62dc7de5ac4bc5fa88c6f2fff894"} Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.087805 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" event={"ID":"c4842010-e137-466d-9596-e65f0cf2f4da","Type":"ContainerStarted","Data":"5caf7db86d82198e233c9af5a90d5b97d038e01ec61ad89ee60361b87114a642"} 
Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.088056 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.089890 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.095467 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=17.09543494 podStartE2EDuration="17.09543494s" podCreationTimestamp="2026-03-21 04:27:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:28:05.090351243 +0000 UTC m=+289.418137939" watchObservedRunningTime="2026-03-21 04:28:05.09543494 +0000 UTC m=+289.423221666" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.097660 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.114383 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567788-9snlp" podStartSLOduration=4.108270692 podStartE2EDuration="5.114355877s" podCreationTimestamp="2026-03-21 04:28:00 +0000 UTC" firstStartedPulling="2026-03-21 04:28:03.706165513 +0000 UTC m=+288.033952219" lastFinishedPulling="2026-03-21 04:28:04.712250728 +0000 UTC m=+289.040037404" observedRunningTime="2026-03-21 04:28:05.108232503 +0000 UTC m=+289.436019179" watchObservedRunningTime="2026-03-21 04:28:05.114355877 +0000 UTC m=+289.442142553" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.144815 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-445ww" podStartSLOduration=225.144779044 podStartE2EDuration="3m45.144779044s" podCreationTimestamp="2026-03-21 04:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:28:05.142921404 +0000 UTC m=+289.470708090" watchObservedRunningTime="2026-03-21 04:28:05.144779044 +0000 UTC m=+289.472565720" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.182324 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" podStartSLOduration=19.182301269 podStartE2EDuration="19.182301269s" podCreationTimestamp="2026-03-21 04:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:28:05.177323776 +0000 UTC m=+289.505110462" watchObservedRunningTime="2026-03-21 04:28:05.182301269 +0000 UTC m=+289.510087945" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.199336 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" podStartSLOduration=19.199321506 podStartE2EDuration="19.199321506s" podCreationTimestamp="2026-03-21 04:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:28:05.198693139 +0000 UTC m=+289.526479825" watchObservedRunningTime="2026-03-21 04:28:05.199321506 +0000 UTC m=+289.527108182" Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.428436 4839 csr.go:261] certificate signing request csr-wtcz9 is approved, waiting to be issued Mar 21 04:28:05 crc kubenswrapper[4839]: I0321 04:28:05.435475 4839 csr.go:257] certificate signing request csr-wtcz9 is issued Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.101744 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrc8" event={"ID":"6513c45b-dd98-40b0-b69c-94db4d1c916e","Type":"ContainerStarted","Data":"f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4"} Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.104513 4839 generic.go:334] "Generic (PLEG): container finished" podID="a45deb0c-4247-4d23-86db-a897c7f7e7f2" containerID="4d013e774070ce075bd0baa030b45d638ec14fab41990f0c671aa0d311846927" exitCode=0 Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.104776 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567788-9snlp" event={"ID":"a45deb0c-4247-4d23-86db-a897c7f7e7f2","Type":"ContainerDied","Data":"4d013e774070ce075bd0baa030b45d638ec14fab41990f0c671aa0d311846927"} Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.124825 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mxrc8" podStartSLOduration=3.330857033 podStartE2EDuration="59.124808069s" podCreationTimestamp="2026-03-21 04:27:07 +0000 UTC" firstStartedPulling="2026-03-21 04:27:09.826772367 +0000 UTC m=+234.154559043" lastFinishedPulling="2026-03-21 04:28:05.620723393 +0000 UTC m=+289.948510079" observedRunningTime="2026-03-21 04:28:06.119902947 +0000 UTC m=+290.447689623" watchObservedRunningTime="2026-03-21 04:28:06.124808069 +0000 UTC m=+290.452594745" Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.379683 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.437035 4839 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-11 20:36:14.495149546 +0000 UTC Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.437101 4839 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7120h8m8.058052028s for next certificate rotation Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.482467 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kubelet-dir\") pod \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.482654 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kube-api-access\") pod \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\" (UID: \"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef\") " Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.482665 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d62030f0-7d6f-46f7-83a2-a28fafe1a4ef" (UID: "d62030f0-7d6f-46f7-83a2-a28fafe1a4ef"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.483294 4839 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.491032 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d62030f0-7d6f-46f7-83a2-a28fafe1a4ef" (UID: "d62030f0-7d6f-46f7-83a2-a28fafe1a4ef"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:06 crc kubenswrapper[4839]: I0321 04:28:06.584588 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d62030f0-7d6f-46f7-83a2-a28fafe1a4ef-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.112947 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" event={"ID":"609ace61-45d1-44f6-b378-fb97eecf2374","Type":"ContainerStarted","Data":"de6f2a80d57a636d18226b6f51d6ae0c6746d29df097ca4fd364524695c212fc"} Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.117835 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d62030f0-7d6f-46f7-83a2-a28fafe1a4ef","Type":"ContainerDied","Data":"068c125d2ab1d568fb725b9241f548239afcbec04a0fa6f6362a280d379a40c3"} Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.117898 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="068c125d2ab1d568fb725b9241f548239afcbec04a0fa6f6362a280d379a40c3" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.117948 4839 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.421242 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567788-9snlp" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.437099 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" podStartSLOduration=62.825881119 podStartE2EDuration="2m7.437080329s" podCreationTimestamp="2026-03-21 04:26:00 +0000 UTC" firstStartedPulling="2026-03-21 04:27:02.143648313 +0000 UTC m=+226.471434989" lastFinishedPulling="2026-03-21 04:28:06.754847523 +0000 UTC m=+291.082634199" observedRunningTime="2026-03-21 04:28:07.131075108 +0000 UTC m=+291.458861794" watchObservedRunningTime="2026-03-21 04:28:07.437080329 +0000 UTC m=+291.764867005" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.437333 4839 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-19 08:14:04.031510178 +0000 UTC Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.437355 4839 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6555h45m56.594157792s for next certificate rotation Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.601110 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f67r4\" (UniqueName: \"kubernetes.io/projected/a45deb0c-4247-4d23-86db-a897c7f7e7f2-kube-api-access-f67r4\") pod \"a45deb0c-4247-4d23-86db-a897c7f7e7f2\" (UID: \"a45deb0c-4247-4d23-86db-a897c7f7e7f2\") " Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.610332 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45deb0c-4247-4d23-86db-a897c7f7e7f2-kube-api-access-f67r4" (OuterVolumeSpecName: 
"kube-api-access-f67r4") pod "a45deb0c-4247-4d23-86db-a897c7f7e7f2" (UID: "a45deb0c-4247-4d23-86db-a897c7f7e7f2"). InnerVolumeSpecName "kube-api-access-f67r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.694934 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.696116 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:28:07 crc kubenswrapper[4839]: I0321 04:28:07.703495 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f67r4\" (UniqueName: \"kubernetes.io/projected/a45deb0c-4247-4d23-86db-a897c7f7e7f2-kube-api-access-f67r4\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:08 crc kubenswrapper[4839]: I0321 04:28:08.124878 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567788-9snlp" event={"ID":"a45deb0c-4247-4d23-86db-a897c7f7e7f2","Type":"ContainerDied","Data":"7b7b577e4ba61820653dda6ae5bcf120e30feefac0d8c37751d6044474181784"} Mar 21 04:28:08 crc kubenswrapper[4839]: I0321 04:28:08.125292 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b7b577e4ba61820653dda6ae5bcf120e30feefac0d8c37751d6044474181784" Mar 21 04:28:08 crc kubenswrapper[4839]: I0321 04:28:08.124914 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567788-9snlp" Mar 21 04:28:08 crc kubenswrapper[4839]: I0321 04:28:08.127068 4839 generic.go:334] "Generic (PLEG): container finished" podID="609ace61-45d1-44f6-b378-fb97eecf2374" containerID="de6f2a80d57a636d18226b6f51d6ae0c6746d29df097ca4fd364524695c212fc" exitCode=0 Mar 21 04:28:08 crc kubenswrapper[4839]: I0321 04:28:08.127148 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" event={"ID":"609ace61-45d1-44f6-b378-fb97eecf2374","Type":"ContainerDied","Data":"de6f2a80d57a636d18226b6f51d6ae0c6746d29df097ca4fd364524695c212fc"} Mar 21 04:28:08 crc kubenswrapper[4839]: I0321 04:28:08.976469 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-mxrc8" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="registry-server" probeResult="failure" output=< Mar 21 04:28:08 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 04:28:08 crc kubenswrapper[4839]: > Mar 21 04:28:09 crc kubenswrapper[4839]: I0321 04:28:09.427795 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" Mar 21 04:28:09 crc kubenswrapper[4839]: I0321 04:28:09.535682 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjkdf\" (UniqueName: \"kubernetes.io/projected/609ace61-45d1-44f6-b378-fb97eecf2374-kube-api-access-vjkdf\") pod \"609ace61-45d1-44f6-b378-fb97eecf2374\" (UID: \"609ace61-45d1-44f6-b378-fb97eecf2374\") " Mar 21 04:28:09 crc kubenswrapper[4839]: I0321 04:28:09.541904 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/609ace61-45d1-44f6-b378-fb97eecf2374-kube-api-access-vjkdf" (OuterVolumeSpecName: "kube-api-access-vjkdf") pod "609ace61-45d1-44f6-b378-fb97eecf2374" (UID: "609ace61-45d1-44f6-b378-fb97eecf2374"). InnerVolumeSpecName "kube-api-access-vjkdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:09 crc kubenswrapper[4839]: I0321 04:28:09.637521 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjkdf\" (UniqueName: \"kubernetes.io/projected/609ace61-45d1-44f6-b378-fb97eecf2374-kube-api-access-vjkdf\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:10 crc kubenswrapper[4839]: I0321 04:28:10.143732 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" event={"ID":"609ace61-45d1-44f6-b378-fb97eecf2374","Type":"ContainerDied","Data":"3881096e968c291ccdd0e957e85d1c17697b418b86707f4eba8dd532d8654b50"} Mar 21 04:28:10 crc kubenswrapper[4839]: I0321 04:28:10.144108 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3881096e968c291ccdd0e957e85d1c17697b418b86707f4eba8dd532d8654b50" Mar 21 04:28:10 crc kubenswrapper[4839]: I0321 04:28:10.144742 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567786-d8w8k" Mar 21 04:28:14 crc kubenswrapper[4839]: I0321 04:28:14.165674 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4btp" event={"ID":"dc99f39a-8001-466b-acf1-bd106eb2b81d","Type":"ContainerStarted","Data":"29eeda5c5800bc8b98b9c7f0e11dfdbb6d941849d4928ef702fd78a8e69796aa"} Mar 21 04:28:15 crc kubenswrapper[4839]: I0321 04:28:15.174226 4839 generic.go:334] "Generic (PLEG): container finished" podID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerID="29eeda5c5800bc8b98b9c7f0e11dfdbb6d941849d4928ef702fd78a8e69796aa" exitCode=0 Mar 21 04:28:15 crc kubenswrapper[4839]: I0321 04:28:15.174271 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4btp" event={"ID":"dc99f39a-8001-466b-acf1-bd106eb2b81d","Type":"ContainerDied","Data":"29eeda5c5800bc8b98b9c7f0e11dfdbb6d941849d4928ef702fd78a8e69796aa"} Mar 21 04:28:17 crc kubenswrapper[4839]: I0321 04:28:17.762144 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:28:17 crc kubenswrapper[4839]: I0321 04:28:17.803239 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.197546 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m29z" event={"ID":"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50","Type":"ContainerStarted","Data":"91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa"} Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.199340 4839 generic.go:334] "Generic (PLEG): container finished" podID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerID="5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8" exitCode=0 Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 
04:28:19.199421 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqj2" event={"ID":"f1ec80e5-557b-4c30-8323-87d6b1447a6d","Type":"ContainerDied","Data":"5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8"} Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.202337 4839 generic.go:334] "Generic (PLEG): container finished" podID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerID="32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e" exitCode=0 Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.202411 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qjgq" event={"ID":"0b7a7313-21c4-4909-9ebe-ebe552b29b8c","Type":"ContainerDied","Data":"32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e"} Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.206557 4839 generic.go:334] "Generic (PLEG): container finished" podID="65a571df-f531-458b-9aed-6de99e4607e1" containerID="efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb" exitCode=0 Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.206654 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw7r6" event={"ID":"65a571df-f531-458b-9aed-6de99e4607e1","Type":"ContainerDied","Data":"efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb"} Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.209561 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cznml" event={"ID":"b144748c-2940-4efe-a486-d2b5c1239b12","Type":"ContainerStarted","Data":"c0f3c8cd4904aa41a1b68ecefd439fa3aaa62843dac43bdac15ac235c5222357"} Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.213734 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4btp" 
event={"ID":"dc99f39a-8001-466b-acf1-bd106eb2b81d","Type":"ContainerStarted","Data":"f0e777e6b17b8feadb828ef554e4eb57eaf696f3c9053a1cb52a0aa9d0e7f691"} Mar 21 04:28:19 crc kubenswrapper[4839]: I0321 04:28:19.306414 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v4btp" podStartSLOduration=3.818533605 podStartE2EDuration="1m12.30639409s" podCreationTimestamp="2026-03-21 04:27:07 +0000 UTC" firstStartedPulling="2026-03-21 04:27:09.835838258 +0000 UTC m=+234.163624934" lastFinishedPulling="2026-03-21 04:28:18.323698743 +0000 UTC m=+302.651485419" observedRunningTime="2026-03-21 04:28:19.304859969 +0000 UTC m=+303.632646665" watchObservedRunningTime="2026-03-21 04:28:19.30639409 +0000 UTC m=+303.634180766" Mar 21 04:28:20 crc kubenswrapper[4839]: I0321 04:28:20.222990 4839 generic.go:334] "Generic (PLEG): container finished" podID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerID="91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa" exitCode=0 Mar 21 04:28:20 crc kubenswrapper[4839]: I0321 04:28:20.223077 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m29z" event={"ID":"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50","Type":"ContainerDied","Data":"91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa"} Mar 21 04:28:20 crc kubenswrapper[4839]: I0321 04:28:20.226359 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerStarted","Data":"7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680"} Mar 21 04:28:20 crc kubenswrapper[4839]: I0321 04:28:20.232964 4839 generic.go:334] "Generic (PLEG): container finished" podID="b144748c-2940-4efe-a486-d2b5c1239b12" containerID="c0f3c8cd4904aa41a1b68ecefd439fa3aaa62843dac43bdac15ac235c5222357" exitCode=0 Mar 21 04:28:20 crc kubenswrapper[4839]: I0321 04:28:20.233050 
4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cznml" event={"ID":"b144748c-2940-4efe-a486-d2b5c1239b12","Type":"ContainerDied","Data":"c0f3c8cd4904aa41a1b68ecefd439fa3aaa62843dac43bdac15ac235c5222357"} Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.241595 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qjgq" event={"ID":"0b7a7313-21c4-4909-9ebe-ebe552b29b8c","Type":"ContainerStarted","Data":"afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a"} Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.243814 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqj2" event={"ID":"f1ec80e5-557b-4c30-8323-87d6b1447a6d","Type":"ContainerStarted","Data":"ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8"} Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.247638 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw7r6" event={"ID":"65a571df-f531-458b-9aed-6de99e4607e1","Type":"ContainerStarted","Data":"3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4"} Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.250753 4839 generic.go:334] "Generic (PLEG): container finished" podID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerID="7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680" exitCode=0 Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.250846 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerDied","Data":"7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680"} Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.253122 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cznml" 
event={"ID":"b144748c-2940-4efe-a486-d2b5c1239b12","Type":"ContainerStarted","Data":"5ea3b3f4c3326a4aa81b311c0480c6c4bfb0954f54e7b1a0e142902f9a762cfa"} Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.272022 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9qjgq" podStartSLOduration=4.087990774 podStartE2EDuration="1m12.27200823s" podCreationTimestamp="2026-03-21 04:27:09 +0000 UTC" firstStartedPulling="2026-03-21 04:27:11.612063602 +0000 UTC m=+235.939850278" lastFinishedPulling="2026-03-21 04:28:19.796081058 +0000 UTC m=+304.123867734" observedRunningTime="2026-03-21 04:28:21.271255459 +0000 UTC m=+305.599042135" watchObservedRunningTime="2026-03-21 04:28:21.27200823 +0000 UTC m=+305.599794906" Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.335348 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nw7r6" podStartSLOduration=4.326745562 podStartE2EDuration="1m14.335326079s" podCreationTimestamp="2026-03-21 04:27:07 +0000 UTC" firstStartedPulling="2026-03-21 04:27:09.835517029 +0000 UTC m=+234.163303705" lastFinishedPulling="2026-03-21 04:28:19.844097546 +0000 UTC m=+304.171884222" observedRunningTime="2026-03-21 04:28:21.333555901 +0000 UTC m=+305.661342577" watchObservedRunningTime="2026-03-21 04:28:21.335326079 +0000 UTC m=+305.663112755" Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.337304 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qrqj2" podStartSLOduration=4.406572114 podStartE2EDuration="1m14.337291381s" podCreationTimestamp="2026-03-21 04:27:07 +0000 UTC" firstStartedPulling="2026-03-21 04:27:09.83587402 +0000 UTC m=+234.163660696" lastFinishedPulling="2026-03-21 04:28:19.766593287 +0000 UTC m=+304.094379963" observedRunningTime="2026-03-21 04:28:21.318354013 +0000 UTC m=+305.646140689" 
watchObservedRunningTime="2026-03-21 04:28:21.337291381 +0000 UTC m=+305.665078057" Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.351417 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cznml" podStartSLOduration=6.217478847 podStartE2EDuration="1m11.35140031s" podCreationTimestamp="2026-03-21 04:27:10 +0000 UTC" firstStartedPulling="2026-03-21 04:27:15.71332056 +0000 UTC m=+240.041107236" lastFinishedPulling="2026-03-21 04:28:20.847242023 +0000 UTC m=+305.175028699" observedRunningTime="2026-03-21 04:28:21.348764789 +0000 UTC m=+305.676551485" watchObservedRunningTime="2026-03-21 04:28:21.35140031 +0000 UTC m=+305.679186986" Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.599845 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:28:21 crc kubenswrapper[4839]: I0321 04:28:21.599902 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:28:22 crc kubenswrapper[4839]: I0321 04:28:22.643170 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cznml" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="registry-server" probeResult="failure" output=< Mar 21 04:28:22 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 04:28:22 crc kubenswrapper[4839]: > Mar 21 04:28:23 crc kubenswrapper[4839]: I0321 04:28:23.265729 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m29z" event={"ID":"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50","Type":"ContainerStarted","Data":"1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4"} Mar 21 04:28:24 crc kubenswrapper[4839]: I0321 04:28:24.291634 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-7m29z" podStartSLOduration=3.759663271 podStartE2EDuration="1m15.29160563s" podCreationTimestamp="2026-03-21 04:27:09 +0000 UTC" firstStartedPulling="2026-03-21 04:27:10.992514516 +0000 UTC m=+235.320301192" lastFinishedPulling="2026-03-21 04:28:22.524456875 +0000 UTC m=+306.852243551" observedRunningTime="2026-03-21 04:28:24.290430708 +0000 UTC m=+308.618217384" watchObservedRunningTime="2026-03-21 04:28:24.29160563 +0000 UTC m=+308.619392306" Mar 21 04:28:25 crc kubenswrapper[4839]: I0321 04:28:25.277134 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerStarted","Data":"e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817"} Mar 21 04:28:25 crc kubenswrapper[4839]: I0321 04:28:25.296838 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zgfcm" podStartSLOduration=4.948884055 podStartE2EDuration="1m15.296822261s" podCreationTimestamp="2026-03-21 04:27:10 +0000 UTC" firstStartedPulling="2026-03-21 04:27:13.68974055 +0000 UTC m=+238.017527236" lastFinishedPulling="2026-03-21 04:28:24.037678766 +0000 UTC m=+308.365465442" observedRunningTime="2026-03-21 04:28:25.296082301 +0000 UTC m=+309.623868977" watchObservedRunningTime="2026-03-21 04:28:25.296822261 +0000 UTC m=+309.624608937" Mar 21 04:28:26 crc kubenswrapper[4839]: I0321 04:28:26.705328 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"] Mar 21 04:28:26 crc kubenswrapper[4839]: I0321 04:28:26.705551 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" podUID="602cd797-e549-4e26-a152-a1cb4decf82d" containerName="controller-manager" 
containerID="cri-o://daace327abb9a5740d790334ff3eab34c10f48214de659b79cea981c463ae614" gracePeriod=30 Mar 21 04:28:26 crc kubenswrapper[4839]: I0321 04:28:26.807283 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7"] Mar 21 04:28:26 crc kubenswrapper[4839]: I0321 04:28:26.807886 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" podUID="c4842010-e137-466d-9596-e65f0cf2f4da" containerName="route-controller-manager" containerID="cri-o://5caf7db86d82198e233c9af5a90d5b97d038e01ec61ad89ee60361b87114a642" gracePeriod=30 Mar 21 04:28:27 crc kubenswrapper[4839]: I0321 04:28:27.577475 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:28:27 crc kubenswrapper[4839]: I0321 04:28:27.577547 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:28:27 crc kubenswrapper[4839]: I0321 04:28:27.617894 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.000767 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.000931 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.037501 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.124794 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.124840 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.201434 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.306556 4839 generic.go:334] "Generic (PLEG): container finished" podID="c4842010-e137-466d-9596-e65f0cf2f4da" containerID="5caf7db86d82198e233c9af5a90d5b97d038e01ec61ad89ee60361b87114a642" exitCode=0 Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.306770 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" event={"ID":"c4842010-e137-466d-9596-e65f0cf2f4da","Type":"ContainerDied","Data":"5caf7db86d82198e233c9af5a90d5b97d038e01ec61ad89ee60361b87114a642"} Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.315345 4839 generic.go:334] "Generic (PLEG): container finished" podID="602cd797-e549-4e26-a152-a1cb4decf82d" containerID="daace327abb9a5740d790334ff3eab34c10f48214de659b79cea981c463ae614" exitCode=0 Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.315744 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" event={"ID":"602cd797-e549-4e26-a152-a1cb4decf82d","Type":"ContainerDied","Data":"daace327abb9a5740d790334ff3eab34c10f48214de659b79cea981c463ae614"} Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.358807 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.364105 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.386464 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qrqj2" Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.888937 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v4btp"] Mar 21 04:28:28 crc kubenswrapper[4839]: I0321 04:28:28.983716 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.013661 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-558f576774-7vr74"] Mar 21 04:28:29 crc kubenswrapper[4839]: E0321 04:28:29.013910 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45deb0c-4247-4d23-86db-a897c7f7e7f2" containerName="oc" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014204 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45deb0c-4247-4d23-86db-a897c7f7e7f2" containerName="oc" Mar 21 04:28:29 crc kubenswrapper[4839]: E0321 04:28:29.014231 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62030f0-7d6f-46f7-83a2-a28fafe1a4ef" containerName="pruner" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014237 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62030f0-7d6f-46f7-83a2-a28fafe1a4ef" containerName="pruner" Mar 21 04:28:29 crc kubenswrapper[4839]: E0321 04:28:29.014254 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="602cd797-e549-4e26-a152-a1cb4decf82d" containerName="controller-manager" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014260 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="602cd797-e549-4e26-a152-a1cb4decf82d" containerName="controller-manager" Mar 21 04:28:29 crc 
kubenswrapper[4839]: E0321 04:28:29.014267 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="609ace61-45d1-44f6-b378-fb97eecf2374" containerName="oc" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014273 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="609ace61-45d1-44f6-b378-fb97eecf2374" containerName="oc" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014366 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62030f0-7d6f-46f7-83a2-a28fafe1a4ef" containerName="pruner" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014376 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45deb0c-4247-4d23-86db-a897c7f7e7f2" containerName="oc" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014387 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="602cd797-e549-4e26-a152-a1cb4decf82d" containerName="controller-manager" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.014632 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="609ace61-45d1-44f6-b378-fb97eecf2374" containerName="oc" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.015163 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.023237 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-558f576774-7vr74"] Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.076825 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.119068 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s54qc\" (UniqueName: \"kubernetes.io/projected/602cd797-e549-4e26-a152-a1cb4decf82d-kube-api-access-s54qc\") pod \"602cd797-e549-4e26-a152-a1cb4decf82d\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.119418 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4842010-e137-466d-9596-e65f0cf2f4da-serving-cert\") pod \"c4842010-e137-466d-9596-e65f0cf2f4da\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.119609 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9x2w\" (UniqueName: \"kubernetes.io/projected/c4842010-e137-466d-9596-e65f0cf2f4da-kube-api-access-k9x2w\") pod \"c4842010-e137-466d-9596-e65f0cf2f4da\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.119794 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-client-ca\") pod \"602cd797-e549-4e26-a152-a1cb4decf82d\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.119911 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-config\") pod \"c4842010-e137-466d-9596-e65f0cf2f4da\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120039 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/602cd797-e549-4e26-a152-a1cb4decf82d-serving-cert\") pod \"602cd797-e549-4e26-a152-a1cb4decf82d\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120175 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-proxy-ca-bundles\") pod \"602cd797-e549-4e26-a152-a1cb4decf82d\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120278 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-client-ca\") pod \"c4842010-e137-466d-9596-e65f0cf2f4da\" (UID: \"c4842010-e137-466d-9596-e65f0cf2f4da\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120415 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-config\") pod \"602cd797-e549-4e26-a152-a1cb4decf82d\" (UID: \"602cd797-e549-4e26-a152-a1cb4decf82d\") " Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120634 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-config" (OuterVolumeSpecName: "config") pod "c4842010-e137-466d-9596-e65f0cf2f4da" (UID: "c4842010-e137-466d-9596-e65f0cf2f4da"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120792 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-client-ca\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120919 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-serving-cert\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120788 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-client-ca" (OuterVolumeSpecName: "client-ca") pod "c4842010-e137-466d-9596-e65f0cf2f4da" (UID: "c4842010-e137-466d-9596-e65f0cf2f4da"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.120977 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "602cd797-e549-4e26-a152-a1cb4decf82d" (UID: "602cd797-e549-4e26-a152-a1cb4decf82d"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121002 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-client-ca" (OuterVolumeSpecName: "client-ca") pod "602cd797-e549-4e26-a152-a1cb4decf82d" (UID: "602cd797-e549-4e26-a152-a1cb4decf82d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121291 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-config\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121408 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dngsm\" (UniqueName: \"kubernetes.io/projected/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-kube-api-access-dngsm\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121531 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-proxy-ca-bundles\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121496 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-config" (OuterVolumeSpecName: "config") pod "602cd797-e549-4e26-a152-a1cb4decf82d" (UID: "602cd797-e549-4e26-a152-a1cb4decf82d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121737 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121852 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121883 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.121907 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4842010-e137-466d-9596-e65f0cf2f4da-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.126070 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/602cd797-e549-4e26-a152-a1cb4decf82d-kube-api-access-s54qc" (OuterVolumeSpecName: "kube-api-access-s54qc") pod "602cd797-e549-4e26-a152-a1cb4decf82d" (UID: "602cd797-e549-4e26-a152-a1cb4decf82d"). InnerVolumeSpecName "kube-api-access-s54qc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.126169 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4842010-e137-466d-9596-e65f0cf2f4da-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c4842010-e137-466d-9596-e65f0cf2f4da" (UID: "c4842010-e137-466d-9596-e65f0cf2f4da"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.128456 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4842010-e137-466d-9596-e65f0cf2f4da-kube-api-access-k9x2w" (OuterVolumeSpecName: "kube-api-access-k9x2w") pod "c4842010-e137-466d-9596-e65f0cf2f4da" (UID: "c4842010-e137-466d-9596-e65f0cf2f4da"). InnerVolumeSpecName "kube-api-access-k9x2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.130982 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/602cd797-e549-4e26-a152-a1cb4decf82d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "602cd797-e549-4e26-a152-a1cb4decf82d" (UID: "602cd797-e549-4e26-a152-a1cb4decf82d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.222686 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dngsm\" (UniqueName: \"kubernetes.io/projected/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-kube-api-access-dngsm\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223062 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-proxy-ca-bundles\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223208 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-client-ca\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223323 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-serving-cert\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223456 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-config\") pod 
\"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223624 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/602cd797-e549-4e26-a152-a1cb4decf82d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223726 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/602cd797-e549-4e26-a152-a1cb4decf82d-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223839 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s54qc\" (UniqueName: \"kubernetes.io/projected/602cd797-e549-4e26-a152-a1cb4decf82d-kube-api-access-s54qc\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223920 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9x2w\" (UniqueName: \"kubernetes.io/projected/c4842010-e137-466d-9596-e65f0cf2f4da-kube-api-access-k9x2w\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.223989 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4842010-e137-466d-9596-e65f0cf2f4da-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.224603 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-proxy-ca-bundles\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.224844 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-config\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.225404 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-client-ca\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.229703 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-serving-cert\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.242839 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dngsm\" (UniqueName: \"kubernetes.io/projected/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-kube-api-access-dngsm\") pod \"controller-manager-558f576774-7vr74\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.329931 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.329940 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7" event={"ID":"c4842010-e137-466d-9596-e65f0cf2f4da","Type":"ContainerDied","Data":"8c3849bbe60c3ca3e53c5a7956c8fe11b773acab42338cd4566762648f52546e"} Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.330660 4839 scope.go:117] "RemoveContainer" containerID="5caf7db86d82198e233c9af5a90d5b97d038e01ec61ad89ee60361b87114a642" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.334075 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" event={"ID":"602cd797-e549-4e26-a152-a1cb4decf82d","Type":"ContainerDied","Data":"37880a6725bfc4b97b6547a22128a884c967634f4fc94acf8976fa9f68b2ab94"} Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.334742 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64bdf84bc9-fblcp" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.363716 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7"] Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.366542 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bbccd6fb-qndh7"] Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.374744 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"] Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.375113 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.375182 4839 scope.go:117] "RemoveContainer" containerID="daace327abb9a5740d790334ff3eab34c10f48214de659b79cea981c463ae614" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.380081 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64bdf84bc9-fblcp"] Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.684953 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.685292 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.728423 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:28:29 crc kubenswrapper[4839]: I0321 04:28:29.828675 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-558f576774-7vr74"] Mar 21 04:28:29 crc kubenswrapper[4839]: W0321 04:28:29.839296 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc56aba9c_2ad5_4635_b6ad_eac6f79054c3.slice/crio-d7388d39aa83eaf8d537d9f043477f7e68a9f0a64d46f47372fa63ac019341e1 WatchSource:0}: Error finding container d7388d39aa83eaf8d537d9f043477f7e68a9f0a64d46f47372fa63ac019341e1: Status 404 returned error can't find the container with id d7388d39aa83eaf8d537d9f043477f7e68a9f0a64d46f47372fa63ac019341e1 Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.139103 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 
04:28:30.139180 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.174829 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.343393 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v4btp" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="registry-server" containerID="cri-o://f0e777e6b17b8feadb828ef554e4eb57eaf696f3c9053a1cb52a0aa9d0e7f691" gracePeriod=2 Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.344038 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" event={"ID":"c56aba9c-2ad5-4635-b6ad-eac6f79054c3","Type":"ContainerStarted","Data":"d7388d39aa83eaf8d537d9f043477f7e68a9f0a64d46f47372fa63ac019341e1"} Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.393129 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.394185 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7m29z" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.460735 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="602cd797-e549-4e26-a152-a1cb4decf82d" path="/var/lib/kubelet/pods/602cd797-e549-4e26-a152-a1cb4decf82d/volumes" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.461394 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4842010-e137-466d-9596-e65f0cf2f4da" path="/var/lib/kubelet/pods/c4842010-e137-466d-9596-e65f0cf2f4da/volumes" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.723930 4839 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.723991 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.979891 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.979973 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.980070 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.980829 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:28:30 crc kubenswrapper[4839]: I0321 04:28:30.980927 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" 
containerID="cri-o://a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311" gracePeriod=600 Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.288293 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qrqj2"] Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.350192 4839 generic.go:334] "Generic (PLEG): container finished" podID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerID="f0e777e6b17b8feadb828ef554e4eb57eaf696f3c9053a1cb52a0aa9d0e7f691" exitCode=0 Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.350260 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4btp" event={"ID":"dc99f39a-8001-466b-acf1-bd106eb2b81d","Type":"ContainerDied","Data":"f0e777e6b17b8feadb828ef554e4eb57eaf696f3c9053a1cb52a0aa9d0e7f691"} Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.351778 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" event={"ID":"c56aba9c-2ad5-4635-b6ad-eac6f79054c3","Type":"ContainerStarted","Data":"42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf"} Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.352064 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qrqj2" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="registry-server" containerID="cri-o://ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8" gracePeriod=2 Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.635678 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.681702 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cznml" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 
04:28:31.784639 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zgfcm" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="registry-server" probeResult="failure" output=< Mar 21 04:28:31 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 04:28:31 crc kubenswrapper[4839]: > Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.887751 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25"] Mar 21 04:28:31 crc kubenswrapper[4839]: E0321 04:28:31.888378 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4842010-e137-466d-9596-e65f0cf2f4da" containerName="route-controller-manager" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.888402 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4842010-e137-466d-9596-e65f0cf2f4da" containerName="route-controller-manager" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.888521 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4842010-e137-466d-9596-e65f0cf2f4da" containerName="route-controller-manager" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.889107 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.890723 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.891006 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.891129 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.891250 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.891345 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.891458 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.893489 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-client-ca\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.893524 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e226a30-c23d-4a45-ab06-4087bf0a38c7-serving-cert\") pod 
\"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.893563 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lssj5\" (UniqueName: \"kubernetes.io/projected/2e226a30-c23d-4a45-ab06-4087bf0a38c7-kube-api-access-lssj5\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.893627 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-config\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.896981 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25"] Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.974762 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v4btp" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.994114 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-catalog-content\") pod \"dc99f39a-8001-466b-acf1-bd106eb2b81d\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.994193 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-utilities\") pod \"dc99f39a-8001-466b-acf1-bd106eb2b81d\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.994216 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9sw7\" (UniqueName: \"kubernetes.io/projected/dc99f39a-8001-466b-acf1-bd106eb2b81d-kube-api-access-w9sw7\") pod \"dc99f39a-8001-466b-acf1-bd106eb2b81d\" (UID: \"dc99f39a-8001-466b-acf1-bd106eb2b81d\") " Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.995157 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-config\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.995245 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-client-ca\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" 
Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.995270 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e226a30-c23d-4a45-ab06-4087bf0a38c7-serving-cert\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.995306 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lssj5\" (UniqueName: \"kubernetes.io/projected/2e226a30-c23d-4a45-ab06-4087bf0a38c7-kube-api-access-lssj5\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.995381 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-utilities" (OuterVolumeSpecName: "utilities") pod "dc99f39a-8001-466b-acf1-bd106eb2b81d" (UID: "dc99f39a-8001-466b-acf1-bd106eb2b81d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.996346 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-client-ca\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.996734 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-config\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:31 crc kubenswrapper[4839]: I0321 04:28:31.999872 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc99f39a-8001-466b-acf1-bd106eb2b81d-kube-api-access-w9sw7" (OuterVolumeSpecName: "kube-api-access-w9sw7") pod "dc99f39a-8001-466b-acf1-bd106eb2b81d" (UID: "dc99f39a-8001-466b-acf1-bd106eb2b81d"). InnerVolumeSpecName "kube-api-access-w9sw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.000412 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e226a30-c23d-4a45-ab06-4087bf0a38c7-serving-cert\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.018842 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lssj5\" (UniqueName: \"kubernetes.io/projected/2e226a30-c23d-4a45-ab06-4087bf0a38c7-kube-api-access-lssj5\") pod \"route-controller-manager-ff7874d5d-7kj25\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.063059 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc99f39a-8001-466b-acf1-bd106eb2b81d" (UID: "dc99f39a-8001-466b-acf1-bd106eb2b81d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.096355 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9sw7\" (UniqueName: \"kubernetes.io/projected/dc99f39a-8001-466b-acf1-bd106eb2b81d-kube-api-access-w9sw7\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.096398 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.096413 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc99f39a-8001-466b-acf1-bd106eb2b81d-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.267118 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.361158 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311" exitCode=0 Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.361309 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311"} Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.364471 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4btp" event={"ID":"dc99f39a-8001-466b-acf1-bd106eb2b81d","Type":"ContainerDied","Data":"a5776e6c987dacb310eadbac22656829eea56efcfa2c6987693a184baa498a40"} 
Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.364511 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v4btp"
Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.364526 4839 scope.go:117] "RemoveContainer" containerID="f0e777e6b17b8feadb828ef554e4eb57eaf696f3c9053a1cb52a0aa9d0e7f691"
Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.383279 4839 scope.go:117] "RemoveContainer" containerID="29eeda5c5800bc8b98b9c7f0e11dfdbb6d941849d4928ef702fd78a8e69796aa"
Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.388501 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" podStartSLOduration=6.38848198 podStartE2EDuration="6.38848198s" podCreationTimestamp="2026-03-21 04:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:28:32.384798951 +0000 UTC m=+316.712585637" watchObservedRunningTime="2026-03-21 04:28:32.38848198 +0000 UTC m=+316.716268656"
Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.415897 4839 scope.go:117] "RemoveContainer" containerID="87e7256c1b35efeb4f01906aa88cf63b70ae781a00455690c43c1caf1c568dc7"
Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.419816 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v4btp"]
Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.438853 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v4btp"]
Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.460003 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" path="/var/lib/kubelet/pods/dc99f39a-8001-466b-acf1-bd106eb2b81d/volumes"
Mar 21 04:28:32 crc kubenswrapper[4839]: I0321 04:28:32.735475 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25"]
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.274643 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qrqj2"
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.312956 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-utilities\") pod \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") "
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.313076 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7dr2\" (UniqueName: \"kubernetes.io/projected/f1ec80e5-557b-4c30-8323-87d6b1447a6d-kube-api-access-z7dr2\") pod \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") "
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.313134 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-catalog-content\") pod \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\" (UID: \"f1ec80e5-557b-4c30-8323-87d6b1447a6d\") "
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.314004 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-utilities" (OuterVolumeSpecName: "utilities") pod "f1ec80e5-557b-4c30-8323-87d6b1447a6d" (UID: "f1ec80e5-557b-4c30-8323-87d6b1447a6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.323504 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ec80e5-557b-4c30-8323-87d6b1447a6d-kube-api-access-z7dr2" (OuterVolumeSpecName: "kube-api-access-z7dr2") pod "f1ec80e5-557b-4c30-8323-87d6b1447a6d" (UID: "f1ec80e5-557b-4c30-8323-87d6b1447a6d"). InnerVolumeSpecName "kube-api-access-z7dr2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.365798 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1ec80e5-557b-4c30-8323-87d6b1447a6d" (UID: "f1ec80e5-557b-4c30-8323-87d6b1447a6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.378709 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" event={"ID":"2e226a30-c23d-4a45-ab06-4087bf0a38c7","Type":"ContainerStarted","Data":"ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c"}
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.378798 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" event={"ID":"2e226a30-c23d-4a45-ab06-4087bf0a38c7","Type":"ContainerStarted","Data":"13aade652ede3403d6b69f04e499915fd3869f0f1415b459504a8ea3e6cac5bf"}
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.378855 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25"
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.391195 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"e09fc13ebec75e4a854ca3cecb49f40ab8a65cb0b655c2368ba9c14be11281c6"}
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.402977 4839 generic.go:334] "Generic (PLEG): container finished" podID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerID="ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8" exitCode=0
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.403028 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqj2" event={"ID":"f1ec80e5-557b-4c30-8323-87d6b1447a6d","Type":"ContainerDied","Data":"ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8"}
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.403101 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrqj2" event={"ID":"f1ec80e5-557b-4c30-8323-87d6b1447a6d","Type":"ContainerDied","Data":"45c4f382e92761207baa8e2c4160a24c616ae580e9d48096eb767d7eab157d90"}
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.403141 4839 scope.go:117] "RemoveContainer" containerID="ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8"
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.403047 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qrqj2"
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.411337 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" podStartSLOduration=7.411298024 podStartE2EDuration="7.411298024s" podCreationTimestamp="2026-03-21 04:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:28:33.399286162 +0000 UTC m=+317.727072838" watchObservedRunningTime="2026-03-21 04:28:33.411298024 +0000 UTC m=+317.739084700"
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.414245 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.414300 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ec80e5-557b-4c30-8323-87d6b1447a6d-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.414314 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7dr2\" (UniqueName: \"kubernetes.io/projected/f1ec80e5-557b-4c30-8323-87d6b1447a6d-kube-api-access-z7dr2\") on node \"crc\" DevicePath \"\""
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.422750 4839 scope.go:117] "RemoveContainer" containerID="5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8"
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.440361 4839 scope.go:117] "RemoveContainer" containerID="5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250"
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.457864 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qrqj2"]
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.462652 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qrqj2"]
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.487838 4839 scope.go:117] "RemoveContainer" containerID="ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8"
Mar 21 04:28:33 crc kubenswrapper[4839]: E0321 04:28:33.491769 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8\": container with ID starting with ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8 not found: ID does not exist" containerID="ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8"
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.491832 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8"} err="failed to get container status \"ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8\": rpc error: code = NotFound desc = could not find container \"ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8\": container with ID starting with ff2ff5500daef64d772e498675382b547240590af237209b74ba0f37bf72bdc8 not found: ID does not exist"
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.491865 4839 scope.go:117] "RemoveContainer" containerID="5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8"
Mar 21 04:28:33 crc kubenswrapper[4839]: E0321 04:28:33.495789 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8\": container with ID starting with 5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8 not found: ID does not exist" containerID="5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8"
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.495836 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8"} err="failed to get container status \"5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8\": rpc error: code = NotFound desc = could not find container \"5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8\": container with ID starting with 5b02a43d09df8da1fb3663f2b16933f0e9d27646e4ffb38ae47dd8d3634049a8 not found: ID does not exist"
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.495863 4839 scope.go:117] "RemoveContainer" containerID="5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250"
Mar 21 04:28:33 crc kubenswrapper[4839]: E0321 04:28:33.496370 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250\": container with ID starting with 5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250 not found: ID does not exist" containerID="5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250"
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.496396 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250"} err="failed to get container status \"5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250\": rpc error: code = NotFound desc = could not find container \"5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250\": container with ID starting with 5d071283174381fba88bb02cf3efc908a9879b9241ab8bc6042c28d3491e2250 not found: ID does not exist"
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.668444 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25"
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.695375 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7m29z"]
Mar 21 04:28:33 crc kubenswrapper[4839]: I0321 04:28:33.695793 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7m29z" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="registry-server" containerID="cri-o://1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4" gracePeriod=2
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.029805 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7m29z"
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.225012 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57m7l\" (UniqueName: \"kubernetes.io/projected/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-kube-api-access-57m7l\") pod \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") "
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.225064 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-catalog-content\") pod \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") "
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.225145 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-utilities\") pod \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\" (UID: \"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50\") "
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.226139 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-utilities" (OuterVolumeSpecName: "utilities") pod "c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" (UID: "c3ae9e7a-784b-4a39-bd4e-10dbff65cd50"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.230470 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-kube-api-access-57m7l" (OuterVolumeSpecName: "kube-api-access-57m7l") pod "c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" (UID: "c3ae9e7a-784b-4a39-bd4e-10dbff65cd50"). InnerVolumeSpecName "kube-api-access-57m7l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.254763 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" (UID: "c3ae9e7a-784b-4a39-bd4e-10dbff65cd50"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.327192 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.327226 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.327237 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57m7l\" (UniqueName: \"kubernetes.io/projected/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50-kube-api-access-57m7l\") on node \"crc\" DevicePath \"\""
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.412033 4839 generic.go:334] "Generic (PLEG): container finished" podID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerID="1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4" exitCode=0
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.412123 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7m29z"
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.412147 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m29z" event={"ID":"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50","Type":"ContainerDied","Data":"1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4"}
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.412202 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7m29z" event={"ID":"c3ae9e7a-784b-4a39-bd4e-10dbff65cd50","Type":"ContainerDied","Data":"ffa5ea5aab95eb6762df26e9e80adb2d4051bdda8b4098100f8d0af4408a8c40"}
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.412225 4839 scope.go:117] "RemoveContainer" containerID="1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4"
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.428667 4839 scope.go:117] "RemoveContainer" containerID="91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa"
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.437253 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7m29z"]
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.441461 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7m29z"]
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.445193 4839 scope.go:117] "RemoveContainer" containerID="7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5"
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.461035 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" path="/var/lib/kubelet/pods/c3ae9e7a-784b-4a39-bd4e-10dbff65cd50/volumes"
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.461800 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" path="/var/lib/kubelet/pods/f1ec80e5-557b-4c30-8323-87d6b1447a6d/volumes"
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.461824 4839 scope.go:117] "RemoveContainer" containerID="1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4"
Mar 21 04:28:34 crc kubenswrapper[4839]: E0321 04:28:34.462177 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4\": container with ID starting with 1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4 not found: ID does not exist" containerID="1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4"
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.462212 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4"} err="failed to get container status \"1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4\": rpc error: code = NotFound desc = could not find container \"1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4\": container with ID starting with 1c80657e72742fec76c998465cd0458ce77bc25333804e766c33538d85b460a4 not found: ID does not exist"
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.462238 4839 scope.go:117] "RemoveContainer" containerID="91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa"
Mar 21 04:28:34 crc kubenswrapper[4839]: E0321 04:28:34.462631 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa\": container with ID starting with 91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa not found: ID does not exist" containerID="91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa"
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.462663 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa"} err="failed to get container status \"91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa\": rpc error: code = NotFound desc = could not find container \"91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa\": container with ID starting with 91243299e8fcd1dc886a5177da277836895624fb7961cb20f7e40640b95d90fa not found: ID does not exist"
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.462682 4839 scope.go:117] "RemoveContainer" containerID="7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5"
Mar 21 04:28:34 crc kubenswrapper[4839]: E0321 04:28:34.462958 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5\": container with ID starting with 7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5 not found: ID does not exist" containerID="7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5"
Mar 21 04:28:34 crc kubenswrapper[4839]: I0321 04:28:34.462996 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5"} err="failed to get container status \"7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5\": rpc error: code = NotFound desc = could not find container \"7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5\": container with ID starting with 7521db13f86a9c3794fc986fd50597367465ad735540781bebd8b03f96db40d5 not found: ID does not exist"
Mar 21 04:28:36 crc kubenswrapper[4839]: I0321 04:28:36.090396 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cznml"]
Mar 21 04:28:36 crc kubenswrapper[4839]: I0321 04:28:36.090689 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cznml" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="registry-server" containerID="cri-o://5ea3b3f4c3326a4aa81b311c0480c6c4bfb0954f54e7b1a0e142902f9a762cfa" gracePeriod=2
Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.440538 4839 generic.go:334] "Generic (PLEG): container finished" podID="b144748c-2940-4efe-a486-d2b5c1239b12" containerID="5ea3b3f4c3326a4aa81b311c0480c6c4bfb0954f54e7b1a0e142902f9a762cfa" exitCode=0
Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.440617 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cznml" event={"ID":"b144748c-2940-4efe-a486-d2b5c1239b12","Type":"ContainerDied","Data":"5ea3b3f4c3326a4aa81b311c0480c6c4bfb0954f54e7b1a0e142902f9a762cfa"}
Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.672147 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cznml"
Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.821360 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d994r\" (UniqueName: \"kubernetes.io/projected/b144748c-2940-4efe-a486-d2b5c1239b12-kube-api-access-d994r\") pod \"b144748c-2940-4efe-a486-d2b5c1239b12\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") "
Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.821468 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-catalog-content\") pod \"b144748c-2940-4efe-a486-d2b5c1239b12\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") "
Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.821607 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-utilities\") pod \"b144748c-2940-4efe-a486-d2b5c1239b12\" (UID: \"b144748c-2940-4efe-a486-d2b5c1239b12\") "
Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.822395 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-utilities" (OuterVolumeSpecName: "utilities") pod "b144748c-2940-4efe-a486-d2b5c1239b12" (UID: "b144748c-2940-4efe-a486-d2b5c1239b12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.827265 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b144748c-2940-4efe-a486-d2b5c1239b12-kube-api-access-d994r" (OuterVolumeSpecName: "kube-api-access-d994r") pod "b144748c-2940-4efe-a486-d2b5c1239b12" (UID: "b144748c-2940-4efe-a486-d2b5c1239b12"). InnerVolumeSpecName "kube-api-access-d994r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.923269 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.923314 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d994r\" (UniqueName: \"kubernetes.io/projected/b144748c-2940-4efe-a486-d2b5c1239b12-kube-api-access-d994r\") on node \"crc\" DevicePath \"\""
Mar 21 04:28:37 crc kubenswrapper[4839]: I0321 04:28:37.952747 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b144748c-2940-4efe-a486-d2b5c1239b12" (UID: "b144748c-2940-4efe-a486-d2b5c1239b12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.025070 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b144748c-2940-4efe-a486-d2b5c1239b12-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.449600 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cznml" event={"ID":"b144748c-2940-4efe-a486-d2b5c1239b12","Type":"ContainerDied","Data":"9d0751bfec85855cd6ce730251b6d95d8b4de8c09e14303b2b6a9c1d9c1fd165"}
Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.449642 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cznml"
Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.450667 4839 scope.go:117] "RemoveContainer" containerID="5ea3b3f4c3326a4aa81b311c0480c6c4bfb0954f54e7b1a0e142902f9a762cfa"
Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.469136 4839 scope.go:117] "RemoveContainer" containerID="c0f3c8cd4904aa41a1b68ecefd439fa3aaa62843dac43bdac15ac235c5222357"
Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.500263 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cznml"]
Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.503155 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cznml"]
Mar 21 04:28:38 crc kubenswrapper[4839]: I0321 04:28:38.515651 4839 scope.go:117] "RemoveContainer" containerID="d34aaed9fe2b229d5a38e871187b560f6a9b3aa6a74029511701966c2b92c3d3"
Mar 21 04:28:39 crc kubenswrapper[4839]: I0321 04:28:39.375733 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-558f576774-7vr74"
Mar 21 04:28:39 crc kubenswrapper[4839]: I0321 04:28:39.381180 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-558f576774-7vr74"
Mar 21 04:28:39 crc kubenswrapper[4839]: I0321 04:28:39.818316 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zt77f"]
Mar 21 04:28:40 crc kubenswrapper[4839]: I0321 04:28:40.460234 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" path="/var/lib/kubelet/pods/b144748c-2940-4efe-a486-d2b5c1239b12/volumes"
Mar 21 04:28:40 crc kubenswrapper[4839]: I0321 04:28:40.759798 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zgfcm"
Mar 21 04:28:40 crc kubenswrapper[4839]: I0321 04:28:40.793666 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zgfcm"
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.712615 4839 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.713295 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529" gracePeriod=15
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.713487 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87" gracePeriod=15
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.713602 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898" gracePeriod=15
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.713688 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a" gracePeriod=15
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.713733 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07" gracePeriod=15
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714204 4839 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714408 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714425 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714434 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="registry-server"
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714441 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="registry-server"
Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714450 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714456 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714463 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714468 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714477 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="extract-content"
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714483 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="extract-content"
Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714489 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="extract-content"
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714496 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="extract-content"
Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714503 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714509 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714519 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="extract-content"
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714524 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="extract-content"
Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714535 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="registry-server"
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714541 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="registry-server"
Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714548 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714553 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714561 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="registry-server"
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714586 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="registry-server"
Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714593 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="extract-content"
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714598 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="extract-content"
Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714608 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="extract-utilities"
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714614 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="extract-utilities"
Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714629 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714635 4839 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714647 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="extract-utilities" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714674 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="extract-utilities" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714687 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714695 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714704 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714709 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714719 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="extract-utilities" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714725 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="extract-utilities" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714734 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714740 4839 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.714750 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="extract-utilities" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714756 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="extract-utilities" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714839 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc99f39a-8001-466b-acf1-bd106eb2b81d" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714846 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714855 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714862 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714870 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714878 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714890 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc 
kubenswrapper[4839]: I0321 04:28:41.714902 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ec80e5-557b-4c30-8323-87d6b1447a6d" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714912 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b144748c-2940-4efe-a486-d2b5c1239b12" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714919 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714929 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3ae9e7a-784b-4a39-bd4e-10dbff65cd50" containerName="registry-server" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.714940 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.715049 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.715057 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.715064 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.715070 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.715153 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.719222 4839 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.722450 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.732132 4839 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 21 04:28:41 crc kubenswrapper[4839]: E0321 04:28:41.760681 4839 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.877581 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.877657 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.877678 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.877730 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.877915 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.877967 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.878009 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 
crc kubenswrapper[4839]: I0321 04:28:41.878036 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978665 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978723 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978770 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978795 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978799 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978833 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978813 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978859 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978879 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.978925 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.979116 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.979149 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.979190 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.979197 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.979224 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:41 crc kubenswrapper[4839]: I0321 04:28:41.979429 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.062040 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:42 crc kubenswrapper[4839]: W0321 04:28:42.078320 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-94e01fbf6a44a892697bce003df25f4fb15786bcda5230b57805b1b2d3b70080 WatchSource:0}: Error finding container 94e01fbf6a44a892697bce003df25f4fb15786bcda5230b57805b1b2d3b70080: Status 404 returned error can't find the container with id 94e01fbf6a44a892697bce003df25f4fb15786bcda5230b57805b1b2d3b70080 Mar 21 04:28:42 crc kubenswrapper[4839]: E0321 04:28:42.081712 4839 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ec0d47c41cf10 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:28:42.080841488 +0000 UTC m=+326.408628164,LastTimestamp:2026-03-21 04:28:42.080841488 +0000 UTC m=+326.408628164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.471430 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7087a9cc9ea0a71bf83c49dccad5914cb87b12f2c00337676a7c943aa8d8a9b6"} Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.471965 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"94e01fbf6a44a892697bce003df25f4fb15786bcda5230b57805b1b2d3b70080"} Mar 21 04:28:42 crc kubenswrapper[4839]: E0321 04:28:42.473414 4839 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.474450 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.476796 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.480925 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87" exitCode=0 Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.481015 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898" exitCode=0 Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.481026 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a" exitCode=0 Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.481036 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07" exitCode=2 Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.481097 4839 scope.go:117] "RemoveContainer" containerID="ecafd1a4266b72f369f4b53d8a423bb875c1bc9719282c517a8e999ad5aecc66" Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.487543 4839 generic.go:334] "Generic (PLEG): container finished" podID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" containerID="c850247b91b749f4d993a8c6034f93518caa14ae16f4055edfe77ec5dbf0002f" exitCode=0 Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.487615 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"e3a71e7a-3ead-483f-8de2-9dbf3a336182","Type":"ContainerDied","Data":"c850247b91b749f4d993a8c6034f93518caa14ae16f4055edfe77ec5dbf0002f"} Mar 21 04:28:42 crc kubenswrapper[4839]: I0321 04:28:42.488453 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:43 crc kubenswrapper[4839]: I0321 04:28:43.502240 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.108328 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.109332 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.113690 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.114460 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.114927 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.115466 4839 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207136 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207224 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-var-lock\") pod \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207268 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kube-api-access\") pod \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207285 4839 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207309 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207348 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207372 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-var-lock" (OuterVolumeSpecName: "var-lock") pod "e3a71e7a-3ead-483f-8de2-9dbf3a336182" (UID: "e3a71e7a-3ead-483f-8de2-9dbf3a336182"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207399 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kubelet-dir\") pod \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\" (UID: \"e3a71e7a-3ead-483f-8de2-9dbf3a336182\") " Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207476 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207423 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e3a71e7a-3ead-483f-8de2-9dbf3a336182" (UID: "e3a71e7a-3ead-483f-8de2-9dbf3a336182"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207593 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207893 4839 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207905 4839 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-var-lock\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207915 4839 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207924 4839 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.207931 4839 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.213898 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e3a71e7a-3ead-483f-8de2-9dbf3a336182" (UID: "e3a71e7a-3ead-483f-8de2-9dbf3a336182"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.308993 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3a71e7a-3ead-483f-8de2-9dbf3a336182-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.460856 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.511454 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.512156 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529" exitCode=0 Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.512219 4839 scope.go:117] "RemoveContainer" containerID="412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.512350 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.513649 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.514957 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e3a71e7a-3ead-483f-8de2-9dbf3a336182","Type":"ContainerDied","Data":"c7bf0d3551fb41779d1f34afa1d26b3709109fa26382cddc767b170e4665d502"} Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.514976 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.514992 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7bf0d3551fb41779d1f34afa1d26b3709109fa26382cddc767b170e4665d502" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.516094 4839 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.516399 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 
04:28:44.516638 4839 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.522808 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.523040 4839 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.529475 4839 scope.go:117] "RemoveContainer" containerID="573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.542983 4839 scope.go:117] "RemoveContainer" containerID="6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.558292 4839 scope.go:117] "RemoveContainer" containerID="e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.575701 4839 scope.go:117] "RemoveContainer" containerID="7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.593507 4839 scope.go:117] "RemoveContainer" 
containerID="f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.613245 4839 scope.go:117] "RemoveContainer" containerID="412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87" Mar 21 04:28:44 crc kubenswrapper[4839]: E0321 04:28:44.613891 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\": container with ID starting with 412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87 not found: ID does not exist" containerID="412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.613966 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87"} err="failed to get container status \"412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\": rpc error: code = NotFound desc = could not find container \"412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87\": container with ID starting with 412472fce0e71838c2bb83101879b672e777dcd608bf27261a98261150d42e87 not found: ID does not exist" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.614006 4839 scope.go:117] "RemoveContainer" containerID="573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898" Mar 21 04:28:44 crc kubenswrapper[4839]: E0321 04:28:44.616127 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\": container with ID starting with 573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898 not found: ID does not exist" containerID="573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898" Mar 21 04:28:44 crc 
kubenswrapper[4839]: I0321 04:28:44.616184 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898"} err="failed to get container status \"573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\": rpc error: code = NotFound desc = could not find container \"573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898\": container with ID starting with 573a241321770c9c676d83d8280f81eba0ad01a3e2f857be89d31316117b8898 not found: ID does not exist" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.616215 4839 scope.go:117] "RemoveContainer" containerID="6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a" Mar 21 04:28:44 crc kubenswrapper[4839]: E0321 04:28:44.616681 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\": container with ID starting with 6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a not found: ID does not exist" containerID="6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.616730 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a"} err="failed to get container status \"6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\": rpc error: code = NotFound desc = could not find container \"6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a\": container with ID starting with 6d9bd6ceaa85cd21af55d1d512ac910eaf9d1bcf5d0f10868cd22ac80b13f66a not found: ID does not exist" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.616762 4839 scope.go:117] "RemoveContainer" containerID="e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07" Mar 21 
04:28:44 crc kubenswrapper[4839]: E0321 04:28:44.617415 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\": container with ID starting with e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07 not found: ID does not exist" containerID="e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.617457 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07"} err="failed to get container status \"e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\": rpc error: code = NotFound desc = could not find container \"e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07\": container with ID starting with e4fe4a6c00181492cd5389e34a9bf18d905f69b2f7467eb75dcee9992eeb0d07 not found: ID does not exist" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.617484 4839 scope.go:117] "RemoveContainer" containerID="7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529" Mar 21 04:28:44 crc kubenswrapper[4839]: E0321 04:28:44.617836 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\": container with ID starting with 7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529 not found: ID does not exist" containerID="7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.617868 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529"} err="failed to get container status 
\"7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\": rpc error: code = NotFound desc = could not find container \"7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529\": container with ID starting with 7299e7f513aff2d1d61b42cef3ed386843478716082a6e38d4d6d61d87f48529 not found: ID does not exist" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.617890 4839 scope.go:117] "RemoveContainer" containerID="f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649" Mar 21 04:28:44 crc kubenswrapper[4839]: E0321 04:28:44.618201 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\": container with ID starting with f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649 not found: ID does not exist" containerID="f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649" Mar 21 04:28:44 crc kubenswrapper[4839]: I0321 04:28:44.618242 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649"} err="failed to get container status \"f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\": rpc error: code = NotFound desc = could not find container \"f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649\": container with ID starting with f70600d5431425836d94025fc8166068d8c9f411d0750be07cd7de44575a9649 not found: ID does not exist" Mar 21 04:28:46 crc kubenswrapper[4839]: I0321 04:28:46.455277 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:46 crc kubenswrapper[4839]: I0321 
04:28:46.456019 4839 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:48 crc kubenswrapper[4839]: E0321 04:28:48.475018 4839 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:48 crc kubenswrapper[4839]: E0321 04:28:48.475683 4839 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:48 crc kubenswrapper[4839]: E0321 04:28:48.476019 4839 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:48 crc kubenswrapper[4839]: E0321 04:28:48.476344 4839 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:48 crc kubenswrapper[4839]: E0321 04:28:48.476729 4839 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:48 crc kubenswrapper[4839]: I0321 04:28:48.476765 4839 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to 
update lease" Mar 21 04:28:48 crc kubenswrapper[4839]: E0321 04:28:48.477049 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="200ms" Mar 21 04:28:48 crc kubenswrapper[4839]: E0321 04:28:48.678330 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="400ms" Mar 21 04:28:49 crc kubenswrapper[4839]: E0321 04:28:49.079881 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="800ms" Mar 21 04:28:49 crc kubenswrapper[4839]: E0321 04:28:49.881363 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="1.6s" Mar 21 04:28:51 crc kubenswrapper[4839]: I0321 04:28:51.443644 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:28:51 crc kubenswrapper[4839]: I0321 04:28:51.444058 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:28:51 crc kubenswrapper[4839]: W0321 04:28:51.445085 4839 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27244": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:51 crc kubenswrapper[4839]: E0321 04:28:51.445153 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27244\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:51 crc kubenswrapper[4839]: W0321 04:28:51.445085 4839 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27242": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:51 crc kubenswrapper[4839]: E0321 04:28:51.445183 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27242\": dial tcp 
38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:51 crc kubenswrapper[4839]: E0321 04:28:51.482793 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="3.2s" Mar 21 04:28:51 crc kubenswrapper[4839]: I0321 04:28:51.545128 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:28:51 crc kubenswrapper[4839]: I0321 04:28:51.545199 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:28:51 crc kubenswrapper[4839]: W0321 04:28:51.545707 4839 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27242": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:51 crc kubenswrapper[4839]: E0321 04:28:51.545808 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27242\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:51 crc kubenswrapper[4839]: E0321 04:28:51.793279 4839 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189ec0d47c41cf10 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-21 04:28:42.080841488 +0000 UTC m=+326.408628164,LastTimestamp:2026-03-21 04:28:42.080841488 +0000 UTC m=+326.408628164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 21 04:28:52 crc kubenswrapper[4839]: E0321 04:28:52.444255 4839 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 21 04:28:52 crc kubenswrapper[4839]: E0321 04:28:52.444300 4839 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:52 crc kubenswrapper[4839]: E0321 04:28:52.444358 4839 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:30:54.444334464 +0000 UTC m=+458.772121140 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 21 04:28:52 crc kubenswrapper[4839]: E0321 04:28:52.444374 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-21 04:30:54.444368295 +0000 UTC m=+458.772154971 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:52 crc kubenswrapper[4839]: E0321 04:28:52.546365 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:52 crc kubenswrapper[4839]: E0321 04:28:52.546374 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:52 crc kubenswrapper[4839]: W0321 04:28:52.546826 4839 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27242": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:52 crc kubenswrapper[4839]: E0321 04:28:52.546881 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27242\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.546624 4839 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.546656 4839 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.546673 4839 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.546694 4839 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.546750 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-21 04:30:55.546731579 +0000 UTC m=+459.874518245 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.546784 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-21 04:30:55.54675931 +0000 UTC m=+459.874545986 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Mar 21 04:28:53 crc kubenswrapper[4839]: W0321 04:28:53.748180 4839 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27242": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.748257 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27242\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:53 crc kubenswrapper[4839]: W0321 04:28:53.757331 4839 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27242": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:53 crc kubenswrapper[4839]: E0321 04:28:53.757408 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27242\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:54 crc kubenswrapper[4839]: W0321 04:28:54.260003 4839 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27244": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:54 crc kubenswrapper[4839]: E0321 04:28:54.260125 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27244\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:54 crc kubenswrapper[4839]: I0321 04:28:54.452000 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:54 crc kubenswrapper[4839]: I0321 04:28:54.452984 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:54 crc kubenswrapper[4839]: I0321 04:28:54.470030 4839 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:28:54 crc kubenswrapper[4839]: I0321 04:28:54.470405 4839 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:28:54 crc kubenswrapper[4839]: E0321 04:28:54.470865 4839 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:54 crc kubenswrapper[4839]: I0321 04:28:54.471367 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:54 crc kubenswrapper[4839]: W0321 04:28:54.493773 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-d7d8f89ca680e0c0520bbe3f73e12a2ada4499f48b071573715e41bf75652be8 WatchSource:0}: Error finding container d7d8f89ca680e0c0520bbe3f73e12a2ada4499f48b071573715e41bf75652be8: Status 404 returned error can't find the container with id d7d8f89ca680e0c0520bbe3f73e12a2ada4499f48b071573715e41bf75652be8 Mar 21 04:28:54 crc kubenswrapper[4839]: I0321 04:28:54.570099 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d7d8f89ca680e0c0520bbe3f73e12a2ada4499f48b071573715e41bf75652be8"} Mar 21 04:28:54 crc kubenswrapper[4839]: E0321 04:28:54.684452 4839 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="6.4s" Mar 21 04:28:55 crc kubenswrapper[4839]: E0321 04:28:55.468352 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 21 04:28:55 crc kubenswrapper[4839]: I0321 04:28:55.575918 4839 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d49c73e9f4625513421a7ac55a4f919881d80ead1c5df858527eec13f0b55cb2" exitCode=0 Mar 21 04:28:55 crc kubenswrapper[4839]: I0321 04:28:55.575973 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d49c73e9f4625513421a7ac55a4f919881d80ead1c5df858527eec13f0b55cb2"} Mar 21 04:28:55 crc kubenswrapper[4839]: I0321 04:28:55.576219 4839 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:28:55 crc kubenswrapper[4839]: I0321 04:28:55.576606 4839 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:28:55 crc kubenswrapper[4839]: I0321 04:28:55.576722 4839 status_manager.go:851] "Failed to get status for pod" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Mar 21 04:28:55 crc kubenswrapper[4839]: E0321 04:28:55.577013 4839 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:55 crc kubenswrapper[4839]: W0321 04:28:55.680975 4839 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27242": dial tcp 38.102.83.188:6443: connect: connection refused Mar 21 04:28:55 crc kubenswrapper[4839]: E0321 04:28:55.681031 4839 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27242\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Mar 21 04:28:56 crc kubenswrapper[4839]: E0321 04:28:56.471403 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 21 04:28:56 crc kubenswrapper[4839]: E0321 04:28:56.499423 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.587131 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cd0c768fd3fc7669df9ff99977152e0a55f1eff773769ca371580e0fd59e5587"} Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.587192 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bde8d34b587c2d23e05a83f6a08f5c7a5abc03ac5af7b64f36e36b748e3d494d"} Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.587207 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d38cb0ea447e009dbf47aa1847902929beefa1516c957583d83968e314bfe607"} Mar 21 04:28:56 crc kubenswrapper[4839]: 
I0321 04:28:56.587219 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1bb6a44266fbf9a1f46069e2559c9f6c5efaa916f94a462779947140b2624181"} Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.591702 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.593111 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.593162 4839 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c" exitCode=1 Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.593210 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c"} Mar 21 04:28:56 crc kubenswrapper[4839]: I0321 04:28:56.593890 4839 scope.go:117] "RemoveContainer" containerID="dd3ce831e80df764c5450f1bf45e92fc061ba1c1efa8ade627b570edcb41567c" Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.003025 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.602257 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b3c0e32beebd917f01eaea8197f62ad832a1a8697b7f81f9f5ddaabf1b01d25f"} Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.602505 4839 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.602519 4839 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.602893 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.604949 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.605895 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 21 04:28:57 crc kubenswrapper[4839]: I0321 04:28:57.605935 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4ea3e2d51f5d6c22330bdcd78365352b43f67ec780cfe8d52f618f414340d6f6"} Mar 21 04:28:59 crc kubenswrapper[4839]: I0321 04:28:59.069456 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 21 04:28:59 crc kubenswrapper[4839]: I0321 04:28:59.472839 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:59 crc kubenswrapper[4839]: 
I0321 04:28:59.473093 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:59 crc kubenswrapper[4839]: I0321 04:28:59.479855 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:28:59 crc kubenswrapper[4839]: I0321 04:28:59.821192 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:28:59 crc kubenswrapper[4839]: I0321 04:28:59.824976 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:29:00 crc kubenswrapper[4839]: I0321 04:29:00.101642 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 21 04:29:00 crc kubenswrapper[4839]: I0321 04:29:00.122082 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 21 04:29:00 crc kubenswrapper[4839]: I0321 04:29:00.621978 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:29:02 crc kubenswrapper[4839]: I0321 04:29:02.764869 4839 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:29:02 crc kubenswrapper[4839]: I0321 04:29:02.800100 4839 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5a473f2-0430-4c9b-8ef8-60d457db5188\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:28:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:28:55Z\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-cert-syncer kube-apiserver-cert-regeneration-controller kube-apiserver-insecure-readyz kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-21T04:28:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-cert-syncer kube-apiserver-cert-regeneration-controller kube-apiserver-insecure-readyz 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}}}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49c73e9f4625513421a7ac55a4f919881d80ead1c5df858527eec13f0b55cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49c73e9f4625513421a7ac55a4f919881d80ead1c5df858527eec13f0b55cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-21T04:28:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-21T04:28:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Pending\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Pod 
\"kube-apiserver-crc\" is invalid: metadata.uid: Invalid value: \"c5a473f2-0430-4c9b-8ef8-60d457db5188\": field is immutable" Mar 21 04:29:02 crc kubenswrapper[4839]: I0321 04:29:02.819679 4839 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6516779a-c1a6-4f61-8475-35a9bee3aed1" Mar 21 04:29:03 crc kubenswrapper[4839]: I0321 04:29:03.413741 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 21 04:29:03 crc kubenswrapper[4839]: I0321 04:29:03.637631 4839 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:29:03 crc kubenswrapper[4839]: I0321 04:29:03.637670 4839 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:29:03 crc kubenswrapper[4839]: I0321 04:29:03.640871 4839 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6516779a-c1a6-4f61-8475-35a9bee3aed1" Mar 21 04:29:03 crc kubenswrapper[4839]: I0321 04:29:03.642238 4839 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://1bb6a44266fbf9a1f46069e2559c9f6c5efaa916f94a462779947140b2624181" Mar 21 04:29:03 crc kubenswrapper[4839]: I0321 04:29:03.642274 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:29:04 crc kubenswrapper[4839]: I0321 04:29:04.642561 4839 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 
04:29:04 crc kubenswrapper[4839]: I0321 04:29:04.642610 4839 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:29:04 crc kubenswrapper[4839]: I0321 04:29:04.646027 4839 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6516779a-c1a6-4f61-8475-35a9bee3aed1" Mar 21 04:29:04 crc kubenswrapper[4839]: I0321 04:29:04.846251 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" podUID="6bfbd19d-a44a-459c-bd6e-150241ce3ebb" containerName="oauth-openshift" containerID="cri-o://5917d0257c4a81565499bf920cd6ba405e8b8d34fcd640889d185cadd9ae650d" gracePeriod=15 Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.649401 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" event={"ID":"6bfbd19d-a44a-459c-bd6e-150241ce3ebb","Type":"ContainerDied","Data":"5917d0257c4a81565499bf920cd6ba405e8b8d34fcd640889d185cadd9ae650d"} Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.649530 4839 generic.go:334] "Generic (PLEG): container finished" podID="6bfbd19d-a44a-459c-bd6e-150241ce3ebb" containerID="5917d0257c4a81565499bf920cd6ba405e8b8d34fcd640889d185cadd9ae650d" exitCode=0 Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.850738 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.928427 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-trusted-ca-bundle\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.928481 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-serving-cert\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.928500 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-error\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.928522 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-login\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.928546 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-session\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: 
\"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.929561 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930012 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-provider-selection\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930080 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-cliconfig\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930106 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5ktq\" (UniqueName: \"kubernetes.io/projected/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-kube-api-access-z5ktq\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930136 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-policies\") pod 
\"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930195 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-dir\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930223 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-router-certs\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930249 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-service-ca\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930265 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-idp-0-file-data\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930360 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-ocp-branding-template\") pod \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\" (UID: \"6bfbd19d-a44a-459c-bd6e-150241ce3ebb\") " Mar 
21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930552 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930649 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930669 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930555 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.930834 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.931352 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.934697 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.934948 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-kube-api-access-z5ktq" (OuterVolumeSpecName: "kube-api-access-z5ktq") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "kube-api-access-z5ktq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.934962 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.935186 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.935423 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.935591 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.935719 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.935881 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:05 crc kubenswrapper[4839]: I0321 04:29:05.936048 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "6bfbd19d-a44a-459c-bd6e-150241ce3ebb" (UID: "6bfbd19d-a44a-459c-bd6e-150241ce3ebb"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031189 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031220 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031229 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031238 4839 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031248 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031257 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5ktq\" (UniqueName: \"kubernetes.io/projected/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-kube-api-access-z5ktq\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031266 4839 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031273 4839 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031282 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031290 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 
04:29:06.031300 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.031308 4839 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6bfbd19d-a44a-459c-bd6e-150241ce3ebb-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.655881 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" event={"ID":"6bfbd19d-a44a-459c-bd6e-150241ce3ebb","Type":"ContainerDied","Data":"acab2b98e2e3828439a02d33c7b3fd1855365edb0946861b3e5dc01800f9adfe"} Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.655942 4839 scope.go:117] "RemoveContainer" containerID="5917d0257c4a81565499bf920cd6ba405e8b8d34fcd640889d185cadd9ae650d" Mar 21 04:29:06 crc kubenswrapper[4839]: I0321 04:29:06.655953 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-zt77f" Mar 21 04:29:07 crc kubenswrapper[4839]: I0321 04:29:07.452421 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:29:07 crc kubenswrapper[4839]: I0321 04:29:07.452805 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:29:09 crc kubenswrapper[4839]: I0321 04:29:09.452611 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:29:11 crc kubenswrapper[4839]: I0321 04:29:11.887146 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.129610 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.186744 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.390461 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.392392 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.590949 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.603481 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.743044 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 21 04:29:13 crc kubenswrapper[4839]: I0321 04:29:13.828061 4839 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 21 04:29:14 crc kubenswrapper[4839]: I0321 04:29:14.097258 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" 
Mar 21 04:29:14 crc kubenswrapper[4839]: I0321 04:29:14.125237 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 21 04:29:14 crc kubenswrapper[4839]: I0321 04:29:14.388841 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.069079 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.147277 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.150674 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.435583 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.532351 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.736307 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.894421 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.976666 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 21 04:29:15 crc kubenswrapper[4839]: I0321 04:29:15.997211 4839 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 21 04:29:16 crc kubenswrapper[4839]: I0321 04:29:16.077916 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 21 04:29:16 crc kubenswrapper[4839]: I0321 04:29:16.182791 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 21 04:29:16 crc kubenswrapper[4839]: I0321 04:29:16.182871 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 21 04:29:16 crc kubenswrapper[4839]: I0321 04:29:16.197353 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.441435 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.441645 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.441793 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.442061 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.442245 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.442492 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.442903 4839 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.442967 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.444250 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.450313 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.450528 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.450714 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.453038 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.456763 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.456952 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.460510 4839 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.466929 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-zt77f","openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:29:17 crc 
kubenswrapper[4839]: I0321 04:29:17.467240 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-cb949b455-jr2d6","openshift-kube-apiserver/kube-apiserver-crc"] Mar 21 04:29:17 crc kubenswrapper[4839]: E0321 04:29:17.467457 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bfbd19d-a44a-459c-bd6e-150241ce3ebb" containerName="oauth-openshift" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.467478 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bfbd19d-a44a-459c-bd6e-150241ce3ebb" containerName="oauth-openshift" Mar 21 04:29:17 crc kubenswrapper[4839]: E0321 04:29:17.467502 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" containerName="installer" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.467510 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" containerName="installer" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.467646 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a71e7a-3ead-483f-8de2-9dbf3a336182" containerName="installer" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.467666 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bfbd19d-a44a-459c-bd6e-150241ce3ebb" containerName="oauth-openshift" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.468169 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.469065 4839 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.469089 4839 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c5a473f2-0430-4c9b-8ef8-60d457db5188" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.477131 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.477160 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.477374 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.477426 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.477527 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.477628 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.478047 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.478195 4839 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.478327 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.478666 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.478741 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.484980 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.485241 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.489465 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.495155 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.503634 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.526495 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: 
\"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.526618 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-login\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.526661 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26phm\" (UniqueName: \"kubernetes.io/projected/02d1828c-4b4b-4d6e-994e-d1b383763960-kube-api-access-26phm\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.526821 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-error\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.526858 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-router-certs\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.526902 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-audit-policies\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527032 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527124 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-service-ca\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527270 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527373 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-cliconfig\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527455 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-serving-cert\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527548 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-session\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527737 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02d1828c-4b4b-4d6e-994e-d1b383763960-audit-dir\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.527803 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " 
pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.542604 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=15.542583949 podStartE2EDuration="15.542583949s" podCreationTimestamp="2026-03-21 04:29:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:29:17.52339265 +0000 UTC m=+361.851179346" watchObservedRunningTime="2026-03-21 04:29:17.542583949 +0000 UTC m=+361.870370625" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.573589 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.604117 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.620996 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631251 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631651 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-login\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631695 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26phm\" (UniqueName: 
\"kubernetes.io/projected/02d1828c-4b4b-4d6e-994e-d1b383763960-kube-api-access-26phm\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631732 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-error\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631758 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-router-certs\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631789 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-audit-policies\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631809 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 
04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631831 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-service-ca\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631848 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631878 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-cliconfig\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631896 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-serving-cert\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631914 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-session\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631944 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02d1828c-4b4b-4d6e-994e-d1b383763960-audit-dir\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631962 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.631980 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.632919 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/02d1828c-4b4b-4d6e-994e-d1b383763960-audit-dir\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc 
kubenswrapper[4839]: I0321 04:29:17.632975 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-cliconfig\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.633170 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-service-ca\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.634945 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-audit-policies\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.635413 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.639664 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-login\") pod 
\"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.639921 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.642722 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.642894 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-router-certs\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.645538 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.649221 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-serving-cert\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.649772 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-user-template-error\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.650143 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/02d1828c-4b4b-4d6e-994e-d1b383763960-v4-0-config-system-session\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.654876 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26phm\" (UniqueName: \"kubernetes.io/projected/02d1828c-4b4b-4d6e-994e-d1b383763960-kube-api-access-26phm\") pod \"oauth-openshift-cb949b455-jr2d6\" (UID: \"02d1828c-4b4b-4d6e-994e-d1b383763960\") " pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.715389 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.723683 4839 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"config" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.750580 4839 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.769234 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.798596 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.839487 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.898670 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.947386 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 21 04:29:17 crc kubenswrapper[4839]: I0321 04:29:17.953477 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.028833 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.030456 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.045065 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 
04:29:18.070099 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.117542 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.241292 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.277214 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.407092 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.458169 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bfbd19d-a44a-459c-bd6e-150241ce3ebb" path="/var/lib/kubelet/pods/6bfbd19d-a44a-459c-bd6e-150241ce3ebb/volumes" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.551011 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.556559 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.568262 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.683257 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.694040 4839 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.739914 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.745899 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.777797 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.818600 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.854780 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.931638 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:29:18 crc kubenswrapper[4839]: I0321 04:29:18.968250 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.007483 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.207417 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.212951 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 21 04:29:19 
crc kubenswrapper[4839]: I0321 04:29:19.257220 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.380478 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.404504 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.516341 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.569083 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.578797 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.680200 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.693279 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.745222 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.760209 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.766616 4839 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 21 04:29:19 crc kubenswrapper[4839]: I0321 04:29:19.786877 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.005457 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.076030 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.109331 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.131337 4839 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.166894 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.179697 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.305408 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.333748 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.477229 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.500144 4839 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.528653 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.642359 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.674358 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.676358 4839 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.803746 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.830988 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 21 04:29:20 crc kubenswrapper[4839]: I0321 04:29:20.874540 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.036731 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.074938 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.097964 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.101380 4839 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.178983 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.186192 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.271338 4839 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.315598 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.405427 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.447955 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.492691 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.508447 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.548453 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.615714 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 21 04:29:21 crc 
kubenswrapper[4839]: I0321 04:29:21.629733 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.723488 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.769209 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.831962 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.893925 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 21 04:29:21 crc kubenswrapper[4839]: I0321 04:29:21.951346 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.000543 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.050092 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.051856 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.066768 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.081536 4839 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.091143 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.135728 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-cb949b455-jr2d6"] Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.138500 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.188701 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.299153 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.313098 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.331030 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.331267 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.366424 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.398087 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.399219 4839 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.421855 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.454072 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.553317 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.559102 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.575278 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.621304 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-cb949b455-jr2d6"] Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.656678 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.680849 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.811226 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.884466 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 21 04:29:22 crc 
kubenswrapper[4839]: I0321 04:29:22.958264 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 21 04:29:22 crc kubenswrapper[4839]: I0321 04:29:22.971348 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.012250 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.214388 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.354322 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.413898 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.423990 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.441229 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.495374 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cb949b455-jr2d6_02d1828c-4b4b-4d6e-994e-d1b383763960/oauth-openshift/0.log" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.495416 4839 generic.go:334] "Generic (PLEG): container finished" podID="02d1828c-4b4b-4d6e-994e-d1b383763960" containerID="202d6e665354c94940665494d88d864de5f66f7f1da5473dcf378fd1f89c0d9d" exitCode=255 Mar 21 04:29:23 crc 
kubenswrapper[4839]: I0321 04:29:23.495443 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" event={"ID":"02d1828c-4b4b-4d6e-994e-d1b383763960","Type":"ContainerDied","Data":"202d6e665354c94940665494d88d864de5f66f7f1da5473dcf378fd1f89c0d9d"} Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.495507 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" event={"ID":"02d1828c-4b4b-4d6e-994e-d1b383763960","Type":"ContainerStarted","Data":"ef601578549e3605d9337a01c412bc01a0063ded26fd0d3f4c5256f1ff349d54"} Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.496060 4839 scope.go:117] "RemoveContainer" containerID="202d6e665354c94940665494d88d864de5f66f7f1da5473dcf378fd1f89c0d9d" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.532235 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.545604 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.607524 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.642142 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.643561 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.677355 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 21 04:29:23 crc kubenswrapper[4839]: 
I0321 04:29:23.713311 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.726930 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.772903 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:29:23 crc kubenswrapper[4839]: I0321 04:29:23.872274 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.002451 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.030522 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.036597 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.056511 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.173286 4839 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.173921 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" 
containerID="cri-o://7087a9cc9ea0a71bf83c49dccad5914cb87b12f2c00337676a7c943aa8d8a9b6" gracePeriod=5 Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.181795 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.257717 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.260988 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.340654 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.505555 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cb949b455-jr2d6_02d1828c-4b4b-4d6e-994e-d1b383763960/oauth-openshift/1.log" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.506040 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cb949b455-jr2d6_02d1828c-4b4b-4d6e-994e-d1b383763960/oauth-openshift/0.log" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.506092 4839 generic.go:334] "Generic (PLEG): container finished" podID="02d1828c-4b4b-4d6e-994e-d1b383763960" containerID="6dd6effaf30cd6d83c10fd64f9da93bee36a19a8a87078b69af4038ae06980b0" exitCode=255 Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.506127 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" event={"ID":"02d1828c-4b4b-4d6e-994e-d1b383763960","Type":"ContainerDied","Data":"6dd6effaf30cd6d83c10fd64f9da93bee36a19a8a87078b69af4038ae06980b0"} Mar 21 04:29:24 crc 
kubenswrapper[4839]: I0321 04:29:24.506166 4839 scope.go:117] "RemoveContainer" containerID="202d6e665354c94940665494d88d864de5f66f7f1da5473dcf378fd1f89c0d9d" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.506602 4839 scope.go:117] "RemoveContainer" containerID="6dd6effaf30cd6d83c10fd64f9da93bee36a19a8a87078b69af4038ae06980b0" Mar 21 04:29:24 crc kubenswrapper[4839]: E0321 04:29:24.506890 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-cb949b455-jr2d6_openshift-authentication(02d1828c-4b4b-4d6e-994e-d1b383763960)\"" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" podUID="02d1828c-4b4b-4d6e-994e-d1b383763960" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.636600 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.654584 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.917892 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.939692 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 21 04:29:24 crc kubenswrapper[4839]: I0321 04:29:24.977737 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.114261 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.121268 4839 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.132550 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.142039 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.172174 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.235552 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.238879 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.259076 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.327710 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.328896 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.360021 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.376605 4839 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.514016 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cb949b455-jr2d6_02d1828c-4b4b-4d6e-994e-d1b383763960/oauth-openshift/1.log" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.514616 4839 scope.go:117] "RemoveContainer" containerID="6dd6effaf30cd6d83c10fd64f9da93bee36a19a8a87078b69af4038ae06980b0" Mar 21 04:29:25 crc kubenswrapper[4839]: E0321 04:29:25.514826 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-cb949b455-jr2d6_openshift-authentication(02d1828c-4b4b-4d6e-994e-d1b383763960)\"" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" podUID="02d1828c-4b4b-4d6e-994e-d1b383763960" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.537303 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.574458 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.588898 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.775744 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.796199 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 21 04:29:25 crc kubenswrapper[4839]: I0321 04:29:25.955156 4839 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.065924 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.116674 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.174399 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.228081 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.350121 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.389874 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.404074 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.553367 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.621396 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.639389 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.650531 
4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.662610 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.701261 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-558f576774-7vr74"] Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.701512 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" podUID="c56aba9c-2ad5-4635-b6ad-eac6f79054c3" containerName="controller-manager" containerID="cri-o://42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf" gracePeriod=30 Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.705318 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25"] Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.705517 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" podUID="2e226a30-c23d-4a45-ab06-4087bf0a38c7" containerName="route-controller-manager" containerID="cri-o://ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c" gracePeriod=30 Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.735443 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.754453 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.769407 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 
21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.785634 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 21 04:29:26 crc kubenswrapper[4839]: I0321 04:29:26.925241 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.161342 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.167075 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359380 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-config\") pod \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359457 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-client-ca\") pod \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359493 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-serving-cert\") pod \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359519 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-config\") pod \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359602 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dngsm\" (UniqueName: \"kubernetes.io/projected/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-kube-api-access-dngsm\") pod \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359679 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-client-ca\") pod \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359704 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lssj5\" (UniqueName: \"kubernetes.io/projected/2e226a30-c23d-4a45-ab06-4087bf0a38c7-kube-api-access-lssj5\") pod \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359738 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e226a30-c23d-4a45-ab06-4087bf0a38c7-serving-cert\") pod \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\" (UID: \"2e226a30-c23d-4a45-ab06-4087bf0a38c7\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.359764 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-proxy-ca-bundles\") pod \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\" (UID: \"c56aba9c-2ad5-4635-b6ad-eac6f79054c3\") " Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 
04:29:27.360126 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-client-ca" (OuterVolumeSpecName: "client-ca") pod "c56aba9c-2ad5-4635-b6ad-eac6f79054c3" (UID: "c56aba9c-2ad5-4635-b6ad-eac6f79054c3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.360526 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c56aba9c-2ad5-4635-b6ad-eac6f79054c3" (UID: "c56aba9c-2ad5-4635-b6ad-eac6f79054c3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.360534 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-config" (OuterVolumeSpecName: "config") pod "c56aba9c-2ad5-4635-b6ad-eac6f79054c3" (UID: "c56aba9c-2ad5-4635-b6ad-eac6f79054c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.361144 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-client-ca" (OuterVolumeSpecName: "client-ca") pod "2e226a30-c23d-4a45-ab06-4087bf0a38c7" (UID: "2e226a30-c23d-4a45-ab06-4087bf0a38c7"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.362053 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-config" (OuterVolumeSpecName: "config") pod "2e226a30-c23d-4a45-ab06-4087bf0a38c7" (UID: "2e226a30-c23d-4a45-ab06-4087bf0a38c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.365434 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c56aba9c-2ad5-4635-b6ad-eac6f79054c3" (UID: "c56aba9c-2ad5-4635-b6ad-eac6f79054c3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.365614 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e226a30-c23d-4a45-ab06-4087bf0a38c7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2e226a30-c23d-4a45-ab06-4087bf0a38c7" (UID: "2e226a30-c23d-4a45-ab06-4087bf0a38c7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.365624 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-kube-api-access-dngsm" (OuterVolumeSpecName: "kube-api-access-dngsm") pod "c56aba9c-2ad5-4635-b6ad-eac6f79054c3" (UID: "c56aba9c-2ad5-4635-b6ad-eac6f79054c3"). InnerVolumeSpecName "kube-api-access-dngsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.366220 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e226a30-c23d-4a45-ab06-4087bf0a38c7-kube-api-access-lssj5" (OuterVolumeSpecName: "kube-api-access-lssj5") pod "2e226a30-c23d-4a45-ab06-4087bf0a38c7" (UID: "2e226a30-c23d-4a45-ab06-4087bf0a38c7"). InnerVolumeSpecName "kube-api-access-lssj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.374767 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.430840 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.453331 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461038 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461076 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461087 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461096 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dngsm\" 
(UniqueName: \"kubernetes.io/projected/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-kube-api-access-dngsm\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461109 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e226a30-c23d-4a45-ab06-4087bf0a38c7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461118 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lssj5\" (UniqueName: \"kubernetes.io/projected/2e226a30-c23d-4a45-ab06-4087bf0a38c7-kube-api-access-lssj5\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461127 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e226a30-c23d-4a45-ab06-4087bf0a38c7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461137 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.461146 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c56aba9c-2ad5-4635-b6ad-eac6f79054c3-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.499490 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.525852 4839 generic.go:334] "Generic (PLEG): container finished" podID="c56aba9c-2ad5-4635-b6ad-eac6f79054c3" containerID="42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf" exitCode=0 Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.526218 4839 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" event={"ID":"c56aba9c-2ad5-4635-b6ad-eac6f79054c3","Type":"ContainerDied","Data":"42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf"} Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.526417 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" event={"ID":"c56aba9c-2ad5-4635-b6ad-eac6f79054c3","Type":"ContainerDied","Data":"d7388d39aa83eaf8d537d9f043477f7e68a9f0a64d46f47372fa63ac019341e1"} Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.526303 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-558f576774-7vr74" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.526461 4839 scope.go:117] "RemoveContainer" containerID="42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.528225 4839 generic.go:334] "Generic (PLEG): container finished" podID="2e226a30-c23d-4a45-ab06-4087bf0a38c7" containerID="ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c" exitCode=0 Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.528272 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" event={"ID":"2e226a30-c23d-4a45-ab06-4087bf0a38c7","Type":"ContainerDied","Data":"ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c"} Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.528303 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" event={"ID":"2e226a30-c23d-4a45-ab06-4087bf0a38c7","Type":"ContainerDied","Data":"13aade652ede3403d6b69f04e499915fd3869f0f1415b459504a8ea3e6cac5bf"} Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.528356 4839 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.552644 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.554454 4839 scope.go:117] "RemoveContainer" containerID="42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf" Mar 21 04:29:27 crc kubenswrapper[4839]: E0321 04:29:27.555393 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf\": container with ID starting with 42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf not found: ID does not exist" containerID="42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.555439 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf"} err="failed to get container status \"42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf\": rpc error: code = NotFound desc = could not find container \"42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf\": container with ID starting with 42a603ec98d84a05b1f20eead303786e9ed6177eea3a7880147775cb99246cdf not found: ID does not exist" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.555465 4839 scope.go:117] "RemoveContainer" containerID="ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.560662 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-558f576774-7vr74"] Mar 21 04:29:27 crc 
kubenswrapper[4839]: I0321 04:29:27.564729 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-558f576774-7vr74"] Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.571062 4839 scope.go:117] "RemoveContainer" containerID="ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c" Mar 21 04:29:27 crc kubenswrapper[4839]: E0321 04:29:27.571524 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c\": container with ID starting with ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c not found: ID does not exist" containerID="ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.571658 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c"} err="failed to get container status \"ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c\": rpc error: code = NotFound desc = could not find container \"ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c\": container with ID starting with ebf0cee149fa25552dccfb87a894d627508cabce1c44155d596f5ae2612b7b6c not found: ID does not exist" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.573400 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25"] Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.577092 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff7874d5d-7kj25"] Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.694287 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 21 
04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.696381 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.764752 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.799551 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.799621 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.800132 4839 scope.go:117] "RemoveContainer" containerID="6dd6effaf30cd6d83c10fd64f9da93bee36a19a8a87078b69af4038ae06980b0" Mar 21 04:29:27 crc kubenswrapper[4839]: E0321 04:29:27.800297 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-cb949b455-jr2d6_openshift-authentication(02d1828c-4b4b-4d6e-994e-d1b383763960)\"" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" podUID="02d1828c-4b4b-4d6e-994e-d1b383763960" Mar 21 04:29:27 crc kubenswrapper[4839]: I0321 04:29:27.994909 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.007368 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.075151 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 21 04:29:28 crc 
kubenswrapper[4839]: I0321 04:29:28.282929 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.458736 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e226a30-c23d-4a45-ab06-4087bf0a38c7" path="/var/lib/kubelet/pods/2e226a30-c23d-4a45-ab06-4087bf0a38c7/volumes" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.459277 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c56aba9c-2ad5-4635-b6ad-eac6f79054c3" path="/var/lib/kubelet/pods/c56aba9c-2ad5-4635-b6ad-eac6f79054c3/volumes" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.485591 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-bxmhw"] Mar 21 04:29:28 crc kubenswrapper[4839]: E0321 04:29:28.485792 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.485804 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 21 04:29:28 crc kubenswrapper[4839]: E0321 04:29:28.485813 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e226a30-c23d-4a45-ab06-4087bf0a38c7" containerName="route-controller-manager" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.485818 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e226a30-c23d-4a45-ab06-4087bf0a38c7" containerName="route-controller-manager" Mar 21 04:29:28 crc kubenswrapper[4839]: E0321 04:29:28.485831 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56aba9c-2ad5-4635-b6ad-eac6f79054c3" containerName="controller-manager" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.485837 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c56aba9c-2ad5-4635-b6ad-eac6f79054c3" containerName="controller-manager" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.485920 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e226a30-c23d-4a45-ab06-4087bf0a38c7" containerName="route-controller-manager" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.485930 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56aba9c-2ad5-4635-b6ad-eac6f79054c3" containerName="controller-manager" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.485940 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.486232 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.490948 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.491024 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.491072 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.491083 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.491442 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.492936 4839 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr"] Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.499211 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.500203 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.508983 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.509245 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.509926 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.510961 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.511481 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.518238 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.519896 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.520168 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr"] Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.526546 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-bxmhw"] Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.628838 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.674754 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-config\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.674791 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb0a5f1-f808-400a-a4c1-205733971f86-serving-cert\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.674815 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2h9k\" (UniqueName: \"kubernetes.io/projected/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-kube-api-access-k2h9k\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.674844 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-client-ca\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.674877 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-serving-cert\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.674895 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-config\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.674992 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2kdt\" (UniqueName: \"kubernetes.io/projected/ecb0a5f1-f808-400a-a4c1-205733971f86-kube-api-access-k2kdt\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.675117 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-proxy-ca-bundles\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " 
pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.675187 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-client-ca\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775677 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-client-ca\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775785 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-config\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775804 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb0a5f1-f808-400a-a4c1-205733971f86-serving-cert\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775839 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2h9k\" (UniqueName: 
\"kubernetes.io/projected/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-kube-api-access-k2h9k\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775866 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-client-ca\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775924 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-serving-cert\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775942 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-config\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.775959 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2kdt\" (UniqueName: \"kubernetes.io/projected/ecb0a5f1-f808-400a-a4c1-205733971f86-kube-api-access-k2kdt\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 
04:29:28.776165 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-proxy-ca-bundles\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.777459 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-client-ca\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.777809 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-config\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.777954 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-config\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.779814 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-client-ca\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " 
pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.787912 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb0a5f1-f808-400a-a4c1-205733971f86-serving-cert\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.790079 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-proxy-ca-bundles\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.792386 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2h9k\" (UniqueName: \"kubernetes.io/projected/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-kube-api-access-k2h9k\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.793388 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2kdt\" (UniqueName: \"kubernetes.io/projected/ecb0a5f1-f808-400a-a4c1-205733971f86-kube-api-access-k2kdt\") pod \"route-controller-manager-6698965c79-7hfxr\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.798334 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-serving-cert\") pod \"controller-manager-75b8f8489-bxmhw\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.820417 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:28 crc kubenswrapper[4839]: I0321 04:29:28.827467 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.019741 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr"] Mar 21 04:29:29 crc kubenswrapper[4839]: W0321 04:29:29.061541 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67bd8eb3_11ef_4a6e_a579_e5bddf00634f.slice/crio-0ca0196447a9f98003d964a1b6ae20c72e661ec09f53f87cf2c8a36e650639a9 WatchSource:0}: Error finding container 0ca0196447a9f98003d964a1b6ae20c72e661ec09f53f87cf2c8a36e650639a9: Status 404 returned error can't find the container with id 0ca0196447a9f98003d964a1b6ae20c72e661ec09f53f87cf2c8a36e650639a9 Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.064517 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-bxmhw"] Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.108680 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.240695 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 21 04:29:29 crc 
kubenswrapper[4839]: I0321 04:29:29.349560 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.546582 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.546642 4839 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="7087a9cc9ea0a71bf83c49dccad5914cb87b12f2c00337676a7c943aa8d8a9b6" exitCode=137 Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.555536 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" event={"ID":"67bd8eb3-11ef-4a6e-a579-e5bddf00634f","Type":"ContainerStarted","Data":"5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7"} Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.555619 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" event={"ID":"67bd8eb3-11ef-4a6e-a579-e5bddf00634f","Type":"ContainerStarted","Data":"0ca0196447a9f98003d964a1b6ae20c72e661ec09f53f87cf2c8a36e650639a9"} Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.555772 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.557791 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" event={"ID":"ecb0a5f1-f808-400a-a4c1-205733971f86","Type":"ContainerStarted","Data":"a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4"} Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.557839 4839 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" event={"ID":"ecb0a5f1-f808-400a-a4c1-205733971f86","Type":"ContainerStarted","Data":"8d1f0128d103bab12910baaf83b8d5ee7c491413cb0a80553f2697a11324deab"} Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.558210 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.561498 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.562971 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.574717 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" podStartSLOduration=3.574699172 podStartE2EDuration="3.574699172s" podCreationTimestamp="2026-03-21 04:29:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:29:29.572993383 +0000 UTC m=+373.900780069" watchObservedRunningTime="2026-03-21 04:29:29.574699172 +0000 UTC m=+373.902485848" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.724388 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.724464 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791692 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791749 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791814 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791811 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791857 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791890 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791903 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.791920 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.792012 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.792177 4839 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.792189 4839 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.792201 4839 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.792210 4839 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.799863 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.820533 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.892598 4839 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:29 crc kubenswrapper[4839]: I0321 04:29:29.945141 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 21 04:29:30 crc kubenswrapper[4839]: I0321 04:29:30.127233 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 21 04:29:30 crc kubenswrapper[4839]: I0321 04:29:30.468209 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 21 04:29:30 crc kubenswrapper[4839]: I0321 04:29:30.566210 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 21 04:29:30 crc kubenswrapper[4839]: I0321 04:29:30.566354 4839 scope.go:117] "RemoveContainer" containerID="7087a9cc9ea0a71bf83c49dccad5914cb87b12f2c00337676a7c943aa8d8a9b6" Mar 21 04:29:30 crc kubenswrapper[4839]: I0321 04:29:30.566498 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 21 04:29:41 crc kubenswrapper[4839]: I0321 04:29:41.452558 4839 scope.go:117] "RemoveContainer" containerID="6dd6effaf30cd6d83c10fd64f9da93bee36a19a8a87078b69af4038ae06980b0" Mar 21 04:29:42 crc kubenswrapper[4839]: I0321 04:29:42.626705 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-cb949b455-jr2d6_02d1828c-4b4b-4d6e-994e-d1b383763960/oauth-openshift/1.log" Mar 21 04:29:42 crc kubenswrapper[4839]: I0321 04:29:42.627027 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" event={"ID":"02d1828c-4b4b-4d6e-994e-d1b383763960","Type":"ContainerStarted","Data":"128cf1880264029b1df19f2d836e53b76d4f39d0c9758f4478220d81c2192166"} Mar 21 04:29:42 crc kubenswrapper[4839]: I0321 04:29:42.627351 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:42 crc kubenswrapper[4839]: I0321 04:29:42.632378 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" Mar 21 04:29:42 crc kubenswrapper[4839]: I0321 04:29:42.646662 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-cb949b455-jr2d6" podStartSLOduration=63.646643679 podStartE2EDuration="1m3.646643679s" podCreationTimestamp="2026-03-21 04:28:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:29:42.644737025 +0000 UTC m=+386.972523711" watchObservedRunningTime="2026-03-21 04:29:42.646643679 +0000 UTC m=+386.974430355" Mar 21 04:29:42 crc kubenswrapper[4839]: I0321 04:29:42.650149 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" podStartSLOduration=16.650139199 podStartE2EDuration="16.650139199s" podCreationTimestamp="2026-03-21 04:29:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:29:29.616086615 +0000 UTC m=+373.943873301" watchObservedRunningTime="2026-03-21 04:29:42.650139199 +0000 UTC m=+386.977925875" Mar 21 04:29:46 crc kubenswrapper[4839]: I0321 04:29:46.696943 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-bxmhw"] Mar 21 04:29:46 crc kubenswrapper[4839]: I0321 04:29:46.697457 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" podUID="67bd8eb3-11ef-4a6e-a579-e5bddf00634f" containerName="controller-manager" containerID="cri-o://5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7" gracePeriod=30 Mar 21 04:29:46 crc kubenswrapper[4839]: I0321 04:29:46.711189 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr"] Mar 21 04:29:46 crc kubenswrapper[4839]: I0321 04:29:46.711835 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" podUID="ecb0a5f1-f808-400a-a4c1-205733971f86" containerName="route-controller-manager" containerID="cri-o://a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4" gracePeriod=30 Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.224505 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.259345 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb0a5f1-f808-400a-a4c1-205733971f86-serving-cert\") pod \"ecb0a5f1-f808-400a-a4c1-205733971f86\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.259470 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-config\") pod \"ecb0a5f1-f808-400a-a4c1-205733971f86\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.259520 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2kdt\" (UniqueName: \"kubernetes.io/projected/ecb0a5f1-f808-400a-a4c1-205733971f86-kube-api-access-k2kdt\") pod \"ecb0a5f1-f808-400a-a4c1-205733971f86\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.259554 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-client-ca\") pod \"ecb0a5f1-f808-400a-a4c1-205733971f86\" (UID: \"ecb0a5f1-f808-400a-a4c1-205733971f86\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.260275 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-client-ca" (OuterVolumeSpecName: "client-ca") pod "ecb0a5f1-f808-400a-a4c1-205733971f86" (UID: "ecb0a5f1-f808-400a-a4c1-205733971f86"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.260329 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-config" (OuterVolumeSpecName: "config") pod "ecb0a5f1-f808-400a-a4c1-205733971f86" (UID: "ecb0a5f1-f808-400a-a4c1-205733971f86"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.264806 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecb0a5f1-f808-400a-a4c1-205733971f86-kube-api-access-k2kdt" (OuterVolumeSpecName: "kube-api-access-k2kdt") pod "ecb0a5f1-f808-400a-a4c1-205733971f86" (UID: "ecb0a5f1-f808-400a-a4c1-205733971f86"). InnerVolumeSpecName "kube-api-access-k2kdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.266733 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecb0a5f1-f808-400a-a4c1-205733971f86-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ecb0a5f1-f808-400a-a4c1-205733971f86" (UID: "ecb0a5f1-f808-400a-a4c1-205733971f86"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.340850 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.360699 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb0a5f1-f808-400a-a4c1-205733971f86-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.360738 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.360750 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2kdt\" (UniqueName: \"kubernetes.io/projected/ecb0a5f1-f808-400a-a4c1-205733971f86-kube-api-access-k2kdt\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.360762 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecb0a5f1-f808-400a-a4c1-205733971f86-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.461098 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-client-ca\") pod \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.461145 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-proxy-ca-bundles\") pod \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.461189 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-k2h9k\" (UniqueName: \"kubernetes.io/projected/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-kube-api-access-k2h9k\") pod \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.461277 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-config\") pod \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.461326 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-serving-cert\") pod \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\" (UID: \"67bd8eb3-11ef-4a6e-a579-e5bddf00634f\") " Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.461956 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-client-ca" (OuterVolumeSpecName: "client-ca") pod "67bd8eb3-11ef-4a6e-a579-e5bddf00634f" (UID: "67bd8eb3-11ef-4a6e-a579-e5bddf00634f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.462023 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-config" (OuterVolumeSpecName: "config") pod "67bd8eb3-11ef-4a6e-a579-e5bddf00634f" (UID: "67bd8eb3-11ef-4a6e-a579-e5bddf00634f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.462037 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "67bd8eb3-11ef-4a6e-a579-e5bddf00634f" (UID: "67bd8eb3-11ef-4a6e-a579-e5bddf00634f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.464161 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "67bd8eb3-11ef-4a6e-a579-e5bddf00634f" (UID: "67bd8eb3-11ef-4a6e-a579-e5bddf00634f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.464679 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-kube-api-access-k2h9k" (OuterVolumeSpecName: "kube-api-access-k2h9k") pod "67bd8eb3-11ef-4a6e-a579-e5bddf00634f" (UID: "67bd8eb3-11ef-4a6e-a579-e5bddf00634f"). InnerVolumeSpecName "kube-api-access-k2h9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.562407 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.562441 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.562449 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.562460 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2h9k\" (UniqueName: \"kubernetes.io/projected/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-kube-api-access-k2h9k\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.562468 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bd8eb3-11ef-4a6e-a579-e5bddf00634f-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.652971 4839 generic.go:334] "Generic (PLEG): container finished" podID="67bd8eb3-11ef-4a6e-a579-e5bddf00634f" containerID="5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7" exitCode=0 Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.653039 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" event={"ID":"67bd8eb3-11ef-4a6e-a579-e5bddf00634f","Type":"ContainerDied","Data":"5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7"} Mar 21 04:29:47 crc 
kubenswrapper[4839]: I0321 04:29:47.653066 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" event={"ID":"67bd8eb3-11ef-4a6e-a579-e5bddf00634f","Type":"ContainerDied","Data":"0ca0196447a9f98003d964a1b6ae20c72e661ec09f53f87cf2c8a36e650639a9"} Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.653084 4839 scope.go:117] "RemoveContainer" containerID="5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.653173 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75b8f8489-bxmhw" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.655378 4839 generic.go:334] "Generic (PLEG): container finished" podID="ecb0a5f1-f808-400a-a4c1-205733971f86" containerID="a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4" exitCode=0 Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.655412 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.655408 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" event={"ID":"ecb0a5f1-f808-400a-a4c1-205733971f86","Type":"ContainerDied","Data":"a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4"} Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.655459 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr" event={"ID":"ecb0a5f1-f808-400a-a4c1-205733971f86","Type":"ContainerDied","Data":"8d1f0128d103bab12910baaf83b8d5ee7c491413cb0a80553f2697a11324deab"} Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.666927 4839 scope.go:117] "RemoveContainer" containerID="5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7" Mar 21 04:29:47 crc kubenswrapper[4839]: E0321 04:29:47.667265 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7\": container with ID starting with 5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7 not found: ID does not exist" containerID="5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.667298 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7"} err="failed to get container status \"5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7\": rpc error: code = NotFound desc = could not find container \"5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7\": container with ID starting with 
5489888f82116fb94075e04cc085c30514d0b66b268ad7293d4d847b8ea20bd7 not found: ID does not exist" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.667321 4839 scope.go:117] "RemoveContainer" containerID="a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.680485 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr"] Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.681652 4839 scope.go:117] "RemoveContainer" containerID="a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4" Mar 21 04:29:47 crc kubenswrapper[4839]: E0321 04:29:47.681982 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4\": container with ID starting with a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4 not found: ID does not exist" containerID="a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.682008 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4"} err="failed to get container status \"a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4\": rpc error: code = NotFound desc = could not find container \"a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4\": container with ID starting with a56ed0e167d915e0d9ff351ce8c17a19ff7a39b536f07fee566f40ea9ea7e9f4 not found: ID does not exist" Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.683710 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-7hfxr"] Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.690447 4839 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-bxmhw"] Mar 21 04:29:47 crc kubenswrapper[4839]: I0321 04:29:47.693291 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-bxmhw"] Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.459206 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67bd8eb3-11ef-4a6e-a579-e5bddf00634f" path="/var/lib/kubelet/pods/67bd8eb3-11ef-4a6e-a579-e5bddf00634f/volumes" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.459907 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecb0a5f1-f808-400a-a4c1-205733971f86" path="/var/lib/kubelet/pods/ecb0a5f1-f808-400a-a4c1-205733971f86/volumes" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.503827 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-lhzm8"] Mar 21 04:29:48 crc kubenswrapper[4839]: E0321 04:29:48.504159 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecb0a5f1-f808-400a-a4c1-205733971f86" containerName="route-controller-manager" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.504178 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecb0a5f1-f808-400a-a4c1-205733971f86" containerName="route-controller-manager" Mar 21 04:29:48 crc kubenswrapper[4839]: E0321 04:29:48.504209 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67bd8eb3-11ef-4a6e-a579-e5bddf00634f" containerName="controller-manager" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.504216 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="67bd8eb3-11ef-4a6e-a579-e5bddf00634f" containerName="controller-manager" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.504368 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecb0a5f1-f808-400a-a4c1-205733971f86" 
containerName="route-controller-manager" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.504389 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="67bd8eb3-11ef-4a6e-a579-e5bddf00634f" containerName="controller-manager" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.505022 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.507189 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p"] Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.507591 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.507644 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.507776 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.507820 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.507925 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.507977 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.508049 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.513932 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.514113 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.514234 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.514318 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.514327 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.515219 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-lhzm8"] Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.517012 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.525305 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.525517 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p"] Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.572972 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-config\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573030 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz559\" (UniqueName: \"kubernetes.io/projected/690da0e8-bbb6-43cd-a875-01057cb5c75c-kube-api-access-bz559\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573057 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-client-ca\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573102 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxlzk\" (UniqueName: \"kubernetes.io/projected/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-kube-api-access-hxlzk\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573132 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-proxy-ca-bundles\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: 
\"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573348 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-client-ca\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573514 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-serving-cert\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573644 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-config\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.573712 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690da0e8-bbb6-43cd-a875-01057cb5c75c-serving-cert\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675251 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-client-ca\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675364 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-serving-cert\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675426 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-config\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675459 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690da0e8-bbb6-43cd-a875-01057cb5c75c-serving-cert\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675509 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-config\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 
04:29:48.675546 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz559\" (UniqueName: \"kubernetes.io/projected/690da0e8-bbb6-43cd-a875-01057cb5c75c-kube-api-access-bz559\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675619 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-client-ca\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675725 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxlzk\" (UniqueName: \"kubernetes.io/projected/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-kube-api-access-hxlzk\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.675766 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-proxy-ca-bundles\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.676828 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-client-ca\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: 
\"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.676881 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-client-ca\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.677483 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-config\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.677526 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-config\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.678515 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-proxy-ca-bundles\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.680478 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-serving-cert\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.684998 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690da0e8-bbb6-43cd-a875-01057cb5c75c-serving-cert\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.694170 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxlzk\" (UniqueName: \"kubernetes.io/projected/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-kube-api-access-hxlzk\") pod \"route-controller-manager-69c79dd4cc-nhr5p\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.696463 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz559\" (UniqueName: \"kubernetes.io/projected/690da0e8-bbb6-43cd-a875-01057cb5c75c-kube-api-access-bz559\") pod \"controller-manager-5db558bd57-lhzm8\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.827037 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:48 crc kubenswrapper[4839]: I0321 04:29:48.838223 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.068232 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p"] Mar 21 04:29:49 crc kubenswrapper[4839]: W0321 04:29:49.073226 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ec6a4a0_0142_4e96_8bb1_4bd5592708fb.slice/crio-6801db09bbefa1710476d24d618a489257ebc6c5c6d0190d1048c0d66c2a111e WatchSource:0}: Error finding container 6801db09bbefa1710476d24d618a489257ebc6c5c6d0190d1048c0d66c2a111e: Status 404 returned error can't find the container with id 6801db09bbefa1710476d24d618a489257ebc6c5c6d0190d1048c0d66c2a111e Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.253691 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-lhzm8"] Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.669930 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" event={"ID":"690da0e8-bbb6-43cd-a875-01057cb5c75c","Type":"ContainerStarted","Data":"30c0a19b4b15271d935501a59bd059db3ee741da807e38461cc3b414e2fd9707"} Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.669980 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" event={"ID":"690da0e8-bbb6-43cd-a875-01057cb5c75c","Type":"ContainerStarted","Data":"d7e1d1749d2f8c80b6cb4501bf2388e3ae20d7bb2af4e3b44e88a047b92a941b"} Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.671175 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.673477 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" event={"ID":"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb","Type":"ContainerStarted","Data":"cb1e5111f57ecec9d54deaa24ac7b3e3895324ed67a482c44d209c0b8560c6bc"} Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.673509 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" event={"ID":"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb","Type":"ContainerStarted","Data":"6801db09bbefa1710476d24d618a489257ebc6c5c6d0190d1048c0d66c2a111e"} Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.673721 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.677770 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.679214 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.692325 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" podStartSLOduration=3.692307288 podStartE2EDuration="3.692307288s" podCreationTimestamp="2026-03-21 04:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:29:49.690530169 +0000 UTC m=+394.018316855" watchObservedRunningTime="2026-03-21 04:29:49.692307288 +0000 UTC m=+394.020093964" Mar 21 04:29:49 crc kubenswrapper[4839]: I0321 04:29:49.727800 4839 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" podStartSLOduration=3.7277851699999998 podStartE2EDuration="3.72778517s" podCreationTimestamp="2026-03-21 04:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:29:49.725167448 +0000 UTC m=+394.052954124" watchObservedRunningTime="2026-03-21 04:29:49.72778517 +0000 UTC m=+394.055571846" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.157785 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr"] Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.158956 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.160227 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567790-h7nhz"] Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.160903 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567790-h7nhz" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.161116 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.161289 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.162355 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.163275 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.163406 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.165080 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr"] Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.183548 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567790-h7nhz"] Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.294395 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk4tn\" (UniqueName: \"kubernetes.io/projected/64e6efc9-03ce-4af4-bcc2-bc64ceebc652-kube-api-access-vk4tn\") pod \"auto-csr-approver-29567790-h7nhz\" (UID: \"64e6efc9-03ce-4af4-bcc2-bc64ceebc652\") " pod="openshift-infra/auto-csr-approver-29567790-h7nhz" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.294503 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-config-volume\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.294546 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-secret-volume\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.294602 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8792q\" (UniqueName: \"kubernetes.io/projected/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-kube-api-access-8792q\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.395633 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk4tn\" (UniqueName: \"kubernetes.io/projected/64e6efc9-03ce-4af4-bcc2-bc64ceebc652-kube-api-access-vk4tn\") pod \"auto-csr-approver-29567790-h7nhz\" (UID: \"64e6efc9-03ce-4af4-bcc2-bc64ceebc652\") " pod="openshift-infra/auto-csr-approver-29567790-h7nhz" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.395752 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-config-volume\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 
04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.395796 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-secret-volume\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.395830 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8792q\" (UniqueName: \"kubernetes.io/projected/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-kube-api-access-8792q\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.398231 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-config-volume\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.407358 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-secret-volume\") pod \"collect-profiles-29567790-knjwr\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.411835 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8792q\" (UniqueName: \"kubernetes.io/projected/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-kube-api-access-8792q\") pod \"collect-profiles-29567790-knjwr\" (UID: 
\"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.412261 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk4tn\" (UniqueName: \"kubernetes.io/projected/64e6efc9-03ce-4af4-bcc2-bc64ceebc652-kube-api-access-vk4tn\") pod \"auto-csr-approver-29567790-h7nhz\" (UID: \"64e6efc9-03ce-4af4-bcc2-bc64ceebc652\") " pod="openshift-infra/auto-csr-approver-29567790-h7nhz" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.480819 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.488903 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567790-h7nhz" Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.921654 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr"] Mar 21 04:30:00 crc kubenswrapper[4839]: I0321 04:30:00.930315 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567790-h7nhz"] Mar 21 04:30:00 crc kubenswrapper[4839]: W0321 04:30:00.931157 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fd65835_5c51_49a6_8e2f_9ac9569c2c64.slice/crio-72b46ac923acc04590d465b45e540fcefcfcdaf3bfd4b6ec30526ce0c31dc4d0 WatchSource:0}: Error finding container 72b46ac923acc04590d465b45e540fcefcfcdaf3bfd4b6ec30526ce0c31dc4d0: Status 404 returned error can't find the container with id 72b46ac923acc04590d465b45e540fcefcfcdaf3bfd4b6ec30526ce0c31dc4d0 Mar 21 04:30:00 crc kubenswrapper[4839]: W0321 04:30:00.935584 4839 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64e6efc9_03ce_4af4_bcc2_bc64ceebc652.slice/crio-5b0eb2e2244b4707e2942990ef25d2d51082fbaa4744b1e9d21dec471bf648a0 WatchSource:0}: Error finding container 5b0eb2e2244b4707e2942990ef25d2d51082fbaa4744b1e9d21dec471bf648a0: Status 404 returned error can't find the container with id 5b0eb2e2244b4707e2942990ef25d2d51082fbaa4744b1e9d21dec471bf648a0 Mar 21 04:30:01 crc kubenswrapper[4839]: I0321 04:30:01.766711 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567790-h7nhz" event={"ID":"64e6efc9-03ce-4af4-bcc2-bc64ceebc652","Type":"ContainerStarted","Data":"5b0eb2e2244b4707e2942990ef25d2d51082fbaa4744b1e9d21dec471bf648a0"} Mar 21 04:30:01 crc kubenswrapper[4839]: I0321 04:30:01.769637 4839 generic.go:334] "Generic (PLEG): container finished" podID="0fd65835-5c51-49a6-8e2f-9ac9569c2c64" containerID="d1742f96e69ee0f8c2f73ccffb16323bc9bae63d20c55bd829c98946a612539f" exitCode=0 Mar 21 04:30:01 crc kubenswrapper[4839]: I0321 04:30:01.769684 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" event={"ID":"0fd65835-5c51-49a6-8e2f-9ac9569c2c64","Type":"ContainerDied","Data":"d1742f96e69ee0f8c2f73ccffb16323bc9bae63d20c55bd829c98946a612539f"} Mar 21 04:30:01 crc kubenswrapper[4839]: I0321 04:30:01.769710 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" event={"ID":"0fd65835-5c51-49a6-8e2f-9ac9569c2c64","Type":"ContainerStarted","Data":"72b46ac923acc04590d465b45e540fcefcfcdaf3bfd4b6ec30526ce0c31dc4d0"} Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.204164 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.231219 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8792q\" (UniqueName: \"kubernetes.io/projected/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-kube-api-access-8792q\") pod \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.231877 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-secret-volume\") pod \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.231916 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-config-volume\") pod \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\" (UID: \"0fd65835-5c51-49a6-8e2f-9ac9569c2c64\") " Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.233109 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-config-volume" (OuterVolumeSpecName: "config-volume") pod "0fd65835-5c51-49a6-8e2f-9ac9569c2c64" (UID: "0fd65835-5c51-49a6-8e2f-9ac9569c2c64"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.237553 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0fd65835-5c51-49a6-8e2f-9ac9569c2c64" (UID: "0fd65835-5c51-49a6-8e2f-9ac9569c2c64"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.238703 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-kube-api-access-8792q" (OuterVolumeSpecName: "kube-api-access-8792q") pod "0fd65835-5c51-49a6-8e2f-9ac9569c2c64" (UID: "0fd65835-5c51-49a6-8e2f-9ac9569c2c64"). InnerVolumeSpecName "kube-api-access-8792q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.333046 4839 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.333078 4839 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.333088 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8792q\" (UniqueName: \"kubernetes.io/projected/0fd65835-5c51-49a6-8e2f-9ac9569c2c64-kube-api-access-8792q\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.781653 4839 generic.go:334] "Generic (PLEG): container finished" podID="64e6efc9-03ce-4af4-bcc2-bc64ceebc652" containerID="7a160fd6d3c601d634e7f0ddbce27e4379f3d0fc66482e35f835bfe3e44b6c2b" exitCode=0 Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.781711 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567790-h7nhz" event={"ID":"64e6efc9-03ce-4af4-bcc2-bc64ceebc652","Type":"ContainerDied","Data":"7a160fd6d3c601d634e7f0ddbce27e4379f3d0fc66482e35f835bfe3e44b6c2b"} Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.783505 4839 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" event={"ID":"0fd65835-5c51-49a6-8e2f-9ac9569c2c64","Type":"ContainerDied","Data":"72b46ac923acc04590d465b45e540fcefcfcdaf3bfd4b6ec30526ce0c31dc4d0"} Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.783528 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72b46ac923acc04590d465b45e540fcefcfcdaf3bfd4b6ec30526ce0c31dc4d0" Mar 21 04:30:03 crc kubenswrapper[4839]: I0321 04:30:03.783586 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr" Mar 21 04:30:05 crc kubenswrapper[4839]: I0321 04:30:05.153433 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567790-h7nhz" Mar 21 04:30:05 crc kubenswrapper[4839]: I0321 04:30:05.154902 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk4tn\" (UniqueName: \"kubernetes.io/projected/64e6efc9-03ce-4af4-bcc2-bc64ceebc652-kube-api-access-vk4tn\") pod \"64e6efc9-03ce-4af4-bcc2-bc64ceebc652\" (UID: \"64e6efc9-03ce-4af4-bcc2-bc64ceebc652\") " Mar 21 04:30:05 crc kubenswrapper[4839]: I0321 04:30:05.159819 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64e6efc9-03ce-4af4-bcc2-bc64ceebc652-kube-api-access-vk4tn" (OuterVolumeSpecName: "kube-api-access-vk4tn") pod "64e6efc9-03ce-4af4-bcc2-bc64ceebc652" (UID: "64e6efc9-03ce-4af4-bcc2-bc64ceebc652"). InnerVolumeSpecName "kube-api-access-vk4tn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:30:05 crc kubenswrapper[4839]: I0321 04:30:05.255691 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk4tn\" (UniqueName: \"kubernetes.io/projected/64e6efc9-03ce-4af4-bcc2-bc64ceebc652-kube-api-access-vk4tn\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:05 crc kubenswrapper[4839]: I0321 04:30:05.794625 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567790-h7nhz" event={"ID":"64e6efc9-03ce-4af4-bcc2-bc64ceebc652","Type":"ContainerDied","Data":"5b0eb2e2244b4707e2942990ef25d2d51082fbaa4744b1e9d21dec471bf648a0"} Mar 21 04:30:05 crc kubenswrapper[4839]: I0321 04:30:05.794656 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567790-h7nhz" Mar 21 04:30:05 crc kubenswrapper[4839]: I0321 04:30:05.794662 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b0eb2e2244b4707e2942990ef25d2d51082fbaa4744b1e9d21dec471bf648a0" Mar 21 04:30:26 crc kubenswrapper[4839]: I0321 04:30:26.678826 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-lhzm8"] Mar 21 04:30:26 crc kubenswrapper[4839]: I0321 04:30:26.679630 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" podUID="690da0e8-bbb6-43cd-a875-01057cb5c75c" containerName="controller-manager" containerID="cri-o://30c0a19b4b15271d935501a59bd059db3ee741da807e38461cc3b414e2fd9707" gracePeriod=30 Mar 21 04:30:26 crc kubenswrapper[4839]: I0321 04:30:26.903060 4839 generic.go:334] "Generic (PLEG): container finished" podID="690da0e8-bbb6-43cd-a875-01057cb5c75c" containerID="30c0a19b4b15271d935501a59bd059db3ee741da807e38461cc3b414e2fd9707" exitCode=0 Mar 21 04:30:26 crc kubenswrapper[4839]: I0321 04:30:26.903099 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" event={"ID":"690da0e8-bbb6-43cd-a875-01057cb5c75c","Type":"ContainerDied","Data":"30c0a19b4b15271d935501a59bd059db3ee741da807e38461cc3b414e2fd9707"} Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.160930 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.187930 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-config\") pod \"690da0e8-bbb6-43cd-a875-01057cb5c75c\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.188000 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-proxy-ca-bundles\") pod \"690da0e8-bbb6-43cd-a875-01057cb5c75c\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.188049 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz559\" (UniqueName: \"kubernetes.io/projected/690da0e8-bbb6-43cd-a875-01057cb5c75c-kube-api-access-bz559\") pod \"690da0e8-bbb6-43cd-a875-01057cb5c75c\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.188087 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690da0e8-bbb6-43cd-a875-01057cb5c75c-serving-cert\") pod \"690da0e8-bbb6-43cd-a875-01057cb5c75c\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.188161 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-client-ca\") pod \"690da0e8-bbb6-43cd-a875-01057cb5c75c\" (UID: \"690da0e8-bbb6-43cd-a875-01057cb5c75c\") " Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.189303 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-client-ca" (OuterVolumeSpecName: "client-ca") pod "690da0e8-bbb6-43cd-a875-01057cb5c75c" (UID: "690da0e8-bbb6-43cd-a875-01057cb5c75c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.189320 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "690da0e8-bbb6-43cd-a875-01057cb5c75c" (UID: "690da0e8-bbb6-43cd-a875-01057cb5c75c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.189965 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-config" (OuterVolumeSpecName: "config") pod "690da0e8-bbb6-43cd-a875-01057cb5c75c" (UID: "690da0e8-bbb6-43cd-a875-01057cb5c75c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.194065 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/690da0e8-bbb6-43cd-a875-01057cb5c75c-kube-api-access-bz559" (OuterVolumeSpecName: "kube-api-access-bz559") pod "690da0e8-bbb6-43cd-a875-01057cb5c75c" (UID: "690da0e8-bbb6-43cd-a875-01057cb5c75c"). InnerVolumeSpecName "kube-api-access-bz559". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.194202 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/690da0e8-bbb6-43cd-a875-01057cb5c75c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "690da0e8-bbb6-43cd-a875-01057cb5c75c" (UID: "690da0e8-bbb6-43cd-a875-01057cb5c75c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.290157 4839 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.290211 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz559\" (UniqueName: \"kubernetes.io/projected/690da0e8-bbb6-43cd-a875-01057cb5c75c-kube-api-access-bz559\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.290239 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/690da0e8-bbb6-43cd-a875-01057cb5c75c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.290250 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.290260 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/690da0e8-bbb6-43cd-a875-01057cb5c75c-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.760133 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-pl6wn"] 
Mar 21 04:30:27 crc kubenswrapper[4839]: E0321 04:30:27.760714 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd65835-5c51-49a6-8e2f-9ac9569c2c64" containerName="collect-profiles" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.760730 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd65835-5c51-49a6-8e2f-9ac9569c2c64" containerName="collect-profiles" Mar 21 04:30:27 crc kubenswrapper[4839]: E0321 04:30:27.760740 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e6efc9-03ce-4af4-bcc2-bc64ceebc652" containerName="oc" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.760748 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e6efc9-03ce-4af4-bcc2-bc64ceebc652" containerName="oc" Mar 21 04:30:27 crc kubenswrapper[4839]: E0321 04:30:27.760906 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690da0e8-bbb6-43cd-a875-01057cb5c75c" containerName="controller-manager" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.760916 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="690da0e8-bbb6-43cd-a875-01057cb5c75c" containerName="controller-manager" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.761053 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="690da0e8-bbb6-43cd-a875-01057cb5c75c" containerName="controller-manager" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.761065 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd65835-5c51-49a6-8e2f-9ac9569c2c64" containerName="collect-profiles" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.761083 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="64e6efc9-03ce-4af4-bcc2-bc64ceebc652" containerName="oc" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.761499 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.772424 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-pl6wn"] Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.795130 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b710421-2c24-4791-a561-846b4830b732-serving-cert\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.795177 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-config\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.795197 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-proxy-ca-bundles\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.795407 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-client-ca\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " 
pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.795449 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z8cc\" (UniqueName: \"kubernetes.io/projected/7b710421-2c24-4791-a561-846b4830b732-kube-api-access-7z8cc\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.896849 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-client-ca\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.896893 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z8cc\" (UniqueName: \"kubernetes.io/projected/7b710421-2c24-4791-a561-846b4830b732-kube-api-access-7z8cc\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.896933 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b710421-2c24-4791-a561-846b4830b732-serving-cert\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.896955 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-config\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.896970 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-proxy-ca-bundles\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.898030 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-client-ca\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.898101 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-proxy-ca-bundles\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.900130 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b710421-2c24-4791-a561-846b4830b732-config\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.906233 4839 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b710421-2c24-4791-a561-846b4830b732-serving-cert\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.911948 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" event={"ID":"690da0e8-bbb6-43cd-a875-01057cb5c75c","Type":"ContainerDied","Data":"d7e1d1749d2f8c80b6cb4501bf2388e3ae20d7bb2af4e3b44e88a047b92a941b"} Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.912254 4839 scope.go:117] "RemoveContainer" containerID="30c0a19b4b15271d935501a59bd059db3ee741da807e38461cc3b414e2fd9707" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.912657 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5db558bd57-lhzm8" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.919299 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z8cc\" (UniqueName: \"kubernetes.io/projected/7b710421-2c24-4791-a561-846b4830b732-kube-api-access-7z8cc\") pod \"controller-manager-75b8f8489-pl6wn\" (UID: \"7b710421-2c24-4791-a561-846b4830b732\") " pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.977148 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-lhzm8"] Mar 21 04:30:27 crc kubenswrapper[4839]: I0321 04:30:27.980403 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-lhzm8"] Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.091790 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.459065 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="690da0e8-bbb6-43cd-a875-01057cb5c75c" path="/var/lib/kubelet/pods/690da0e8-bbb6-43cd-a875-01057cb5c75c/volumes" Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.487940 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75b8f8489-pl6wn"] Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.920303 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" event={"ID":"7b710421-2c24-4791-a561-846b4830b732","Type":"ContainerStarted","Data":"7a77379278e6f9bb3b097e82d6b7770ec9bdb02d0153f713b8f21179945aa0a3"} Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.920342 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" event={"ID":"7b710421-2c24-4791-a561-846b4830b732","Type":"ContainerStarted","Data":"1892b5ea14009b912efbf9b734e6dcf7bd4394bc469fa05c6907c7a1e8c51927"} Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.920797 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.925145 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" Mar 21 04:30:28 crc kubenswrapper[4839]: I0321 04:30:28.935784 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75b8f8489-pl6wn" podStartSLOduration=2.935765214 podStartE2EDuration="2.935765214s" podCreationTimestamp="2026-03-21 04:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:30:28.934247052 +0000 UTC m=+433.262033768" watchObservedRunningTime="2026-03-21 04:30:28.935765214 +0000 UTC m=+433.263551890" Mar 21 04:30:46 crc kubenswrapper[4839]: I0321 04:30:46.677173 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p"] Mar 21 04:30:46 crc kubenswrapper[4839]: I0321 04:30:46.679410 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" podUID="3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" containerName="route-controller-manager" containerID="cri-o://cb1e5111f57ecec9d54deaa24ac7b3e3895324ed67a482c44d209c0b8560c6bc" gracePeriod=30 Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.015424 4839 generic.go:334] "Generic (PLEG): container finished" podID="3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" containerID="cb1e5111f57ecec9d54deaa24ac7b3e3895324ed67a482c44d209c0b8560c6bc" exitCode=0 Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.015666 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" event={"ID":"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb","Type":"ContainerDied","Data":"cb1e5111f57ecec9d54deaa24ac7b3e3895324ed67a482c44d209c0b8560c6bc"} Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.135025 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.155163 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxlzk\" (UniqueName: \"kubernetes.io/projected/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-kube-api-access-hxlzk\") pod \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.155207 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-serving-cert\") pod \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.155261 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-config\") pod \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.155295 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-client-ca\") pod \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\" (UID: \"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb\") " Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.156133 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-client-ca" (OuterVolumeSpecName: "client-ca") pod "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" (UID: "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.156436 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-config" (OuterVolumeSpecName: "config") pod "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" (UID: "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.161352 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-kube-api-access-hxlzk" (OuterVolumeSpecName: "kube-api-access-hxlzk") pod "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" (UID: "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb"). InnerVolumeSpecName "kube-api-access-hxlzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.161698 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" (UID: "3ec6a4a0-0142-4e96-8bb1-4bd5592708fb"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.256463 4839 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.256497 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.256506 4839 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.256514 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxlzk\" (UniqueName: \"kubernetes.io/projected/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb-kube-api-access-hxlzk\") on node \"crc\" DevicePath \"\"" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.769926 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb"] Mar 21 04:30:47 crc kubenswrapper[4839]: E0321 04:30:47.770158 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" containerName="route-controller-manager" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.770173 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" containerName="route-controller-manager" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.770294 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" containerName="route-controller-manager" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.770707 4839 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.781839 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb"] Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.881488 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e01f7098-c6cd-4537-b145-c7090c45f92c-config\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.881592 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwxrk\" (UniqueName: \"kubernetes.io/projected/e01f7098-c6cd-4537-b145-c7090c45f92c-kube-api-access-jwxrk\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.881662 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e01f7098-c6cd-4537-b145-c7090c45f92c-client-ca\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.881696 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e01f7098-c6cd-4537-b145-c7090c45f92c-serving-cert\") pod 
\"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.982444 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e01f7098-c6cd-4537-b145-c7090c45f92c-client-ca\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.982529 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e01f7098-c6cd-4537-b145-c7090c45f92c-serving-cert\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.982627 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e01f7098-c6cd-4537-b145-c7090c45f92c-config\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.982670 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwxrk\" (UniqueName: \"kubernetes.io/projected/e01f7098-c6cd-4537-b145-c7090c45f92c-kube-api-access-jwxrk\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.983541 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e01f7098-c6cd-4537-b145-c7090c45f92c-client-ca\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.984860 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e01f7098-c6cd-4537-b145-c7090c45f92c-config\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:47 crc kubenswrapper[4839]: I0321 04:30:47.989695 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e01f7098-c6cd-4537-b145-c7090c45f92c-serving-cert\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.004218 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwxrk\" (UniqueName: \"kubernetes.io/projected/e01f7098-c6cd-4537-b145-c7090c45f92c-kube-api-access-jwxrk\") pod \"route-controller-manager-6698965c79-mwntb\" (UID: \"e01f7098-c6cd-4537-b145-c7090c45f92c\") " pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.025398 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" event={"ID":"3ec6a4a0-0142-4e96-8bb1-4bd5592708fb","Type":"ContainerDied","Data":"6801db09bbefa1710476d24d618a489257ebc6c5c6d0190d1048c0d66c2a111e"} Mar 21 
04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.025847 4839 scope.go:117] "RemoveContainer" containerID="cb1e5111f57ecec9d54deaa24ac7b3e3895324ed67a482c44d209c0b8560c6bc" Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.028762 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p" Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.066888 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p"] Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.069820 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-nhr5p"] Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.090009 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.460165 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ec6a4a0-0142-4e96-8bb1-4bd5592708fb" path="/var/lib/kubelet/pods/3ec6a4a0-0142-4e96-8bb1-4bd5592708fb/volumes" Mar 21 04:30:48 crc kubenswrapper[4839]: I0321 04:30:48.545440 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb"] Mar 21 04:30:48 crc kubenswrapper[4839]: W0321 04:30:48.551385 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode01f7098_c6cd_4537_b145_c7090c45f92c.slice/crio-4b96e27d02dd51feef2f2f1520f0ec60d86838746f0af8378dae40943e81b2d0 WatchSource:0}: Error finding container 4b96e27d02dd51feef2f2f1520f0ec60d86838746f0af8378dae40943e81b2d0: Status 404 returned error can't find the container with id 
4b96e27d02dd51feef2f2f1520f0ec60d86838746f0af8378dae40943e81b2d0 Mar 21 04:30:49 crc kubenswrapper[4839]: I0321 04:30:49.033247 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" event={"ID":"e01f7098-c6cd-4537-b145-c7090c45f92c","Type":"ContainerStarted","Data":"e84ec08bf925ff9dc30f0b6708dd111cd670a713a1eab3bcc8ca98041173de0c"} Mar 21 04:30:49 crc kubenswrapper[4839]: I0321 04:30:49.033323 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" event={"ID":"e01f7098-c6cd-4537-b145-c7090c45f92c","Type":"ContainerStarted","Data":"4b96e27d02dd51feef2f2f1520f0ec60d86838746f0af8378dae40943e81b2d0"} Mar 21 04:30:49 crc kubenswrapper[4839]: I0321 04:30:49.033506 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:49 crc kubenswrapper[4839]: I0321 04:30:49.039853 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" Mar 21 04:30:49 crc kubenswrapper[4839]: I0321 04:30:49.057808 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6698965c79-mwntb" podStartSLOduration=3.05778756 podStartE2EDuration="3.05778756s" podCreationTimestamp="2026-03-21 04:30:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:30:49.053493093 +0000 UTC m=+453.381279769" watchObservedRunningTime="2026-03-21 04:30:49.05778756 +0000 UTC m=+453.385574236" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.678754 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dq7r2"] Mar 21 
04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.679710 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.691097 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dq7r2"] Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.848349 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38bcb76-a7ac-4c9c-8113-82113a818347-trusted-ca\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.848836 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.849004 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-registry-tls\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.849137 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f38bcb76-a7ac-4c9c-8113-82113a818347-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dq7r2\" 
(UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.849264 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdcr5\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-kube-api-access-mdcr5\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.849390 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-bound-sa-token\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.849480 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f38bcb76-a7ac-4c9c-8113-82113a818347-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.849594 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f38bcb76-a7ac-4c9c-8113-82113a818347-registry-certificates\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.884648 4839 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.952519 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-registry-tls\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.952641 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f38bcb76-a7ac-4c9c-8113-82113a818347-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.952707 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdcr5\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-kube-api-access-mdcr5\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.952762 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-bound-sa-token\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 
04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.952780 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f38bcb76-a7ac-4c9c-8113-82113a818347-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.952803 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f38bcb76-a7ac-4c9c-8113-82113a818347-registry-certificates\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.952841 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38bcb76-a7ac-4c9c-8113-82113a818347-trusted-ca\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.953772 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f38bcb76-a7ac-4c9c-8113-82113a818347-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.955527 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f38bcb76-a7ac-4c9c-8113-82113a818347-registry-certificates\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: 
\"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.955864 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f38bcb76-a7ac-4c9c-8113-82113a818347-trusted-ca\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.960319 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f38bcb76-a7ac-4c9c-8113-82113a818347-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.960362 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-registry-tls\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.971945 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-bound-sa-token\") pod \"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:50 crc kubenswrapper[4839]: I0321 04:30:50.978215 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdcr5\" (UniqueName: \"kubernetes.io/projected/f38bcb76-a7ac-4c9c-8113-82113a818347-kube-api-access-mdcr5\") pod 
\"image-registry-66df7c8f76-dq7r2\" (UID: \"f38bcb76-a7ac-4c9c-8113-82113a818347\") " pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:51 crc kubenswrapper[4839]: I0321 04:30:51.007232 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:51 crc kubenswrapper[4839]: I0321 04:30:51.467637 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dq7r2"] Mar 21 04:30:51 crc kubenswrapper[4839]: W0321 04:30:51.478745 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf38bcb76_a7ac_4c9c_8113_82113a818347.slice/crio-dff358d0ff2e550e651ae5623bc54671cd1bf5226486d1ce469fbae73b838e60 WatchSource:0}: Error finding container dff358d0ff2e550e651ae5623bc54671cd1bf5226486d1ce469fbae73b838e60: Status 404 returned error can't find the container with id dff358d0ff2e550e651ae5623bc54671cd1bf5226486d1ce469fbae73b838e60 Mar 21 04:30:52 crc kubenswrapper[4839]: I0321 04:30:52.053148 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" event={"ID":"f38bcb76-a7ac-4c9c-8113-82113a818347","Type":"ContainerStarted","Data":"85f0b2b796ead4669125a67d51a2d115844b73c64d19e425a155a86e90f916b2"} Mar 21 04:30:52 crc kubenswrapper[4839]: I0321 04:30:52.053203 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" event={"ID":"f38bcb76-a7ac-4c9c-8113-82113a818347","Type":"ContainerStarted","Data":"dff358d0ff2e550e651ae5623bc54671cd1bf5226486d1ce469fbae73b838e60"} Mar 21 04:30:52 crc kubenswrapper[4839]: I0321 04:30:52.053388 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:30:52 crc kubenswrapper[4839]: I0321 04:30:52.081949 
4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" podStartSLOduration=2.081919896 podStartE2EDuration="2.081919896s" podCreationTimestamp="2026-03-21 04:30:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:30:52.07584699 +0000 UTC m=+456.403633696" watchObservedRunningTime="2026-03-21 04:30:52.081919896 +0000 UTC m=+456.409706592" Mar 21 04:30:54 crc kubenswrapper[4839]: I0321 04:30:54.502077 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:30:54 crc kubenswrapper[4839]: I0321 04:30:54.502552 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:30:54 crc kubenswrapper[4839]: I0321 04:30:54.503818 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:30:54 crc kubenswrapper[4839]: I0321 04:30:54.509386 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:30:54 crc kubenswrapper[4839]: I0321 04:30:54.554364 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 21 04:30:55 crc kubenswrapper[4839]: I0321 04:30:55.071318 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"745b558b217e5153570af89d9c999c0979ecc2ce0441d5449c2dd805d7fc01ec"} Mar 21 04:30:55 crc kubenswrapper[4839]: I0321 04:30:55.619934 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:30:55 crc kubenswrapper[4839]: I0321 04:30:55.620412 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:30:55 crc kubenswrapper[4839]: I0321 04:30:55.625928 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:30:55 crc kubenswrapper[4839]: I0321 04:30:55.627684 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:30:55 crc kubenswrapper[4839]: I0321 04:30:55.654036 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:30:55 crc kubenswrapper[4839]: I0321 04:30:55.755857 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 21 04:30:56 crc kubenswrapper[4839]: I0321 04:30:56.112207 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1f70fef8b391bd5f5c87fd8bd890dbf0a6c418cf851fa6fc6c48976086e26181"} Mar 21 04:30:56 crc kubenswrapper[4839]: W0321 04:30:56.171846 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-4dbf18e5e98e81f4493f6c492ee383c648403a00f68be58482999bfc91dbb223 WatchSource:0}: Error finding container 4dbf18e5e98e81f4493f6c492ee383c648403a00f68be58482999bfc91dbb223: Status 404 returned error can't find the container with id 4dbf18e5e98e81f4493f6c492ee383c648403a00f68be58482999bfc91dbb223 Mar 21 04:30:56 crc kubenswrapper[4839]: W0321 04:30:56.235627 4839 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-29f8c2e191cc9bfaab8c57de58aba11a74b39306c987e0e9d89e5b71f8ddd8aa WatchSource:0}: Error finding container 29f8c2e191cc9bfaab8c57de58aba11a74b39306c987e0e9d89e5b71f8ddd8aa: Status 404 returned error can't find the container with id 29f8c2e191cc9bfaab8c57de58aba11a74b39306c987e0e9d89e5b71f8ddd8aa Mar 21 04:30:57 crc kubenswrapper[4839]: I0321 04:30:57.119037 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"02c21ed828f04bc82be2a8edc51d67c38b0d2bd2d9096daec5d2fd9d06d41f37"} Mar 21 04:30:57 crc kubenswrapper[4839]: I0321 04:30:57.119411 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"29f8c2e191cc9bfaab8c57de58aba11a74b39306c987e0e9d89e5b71f8ddd8aa"} Mar 21 04:30:57 crc kubenswrapper[4839]: I0321 04:30:57.123798 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"02a2a03d40793c60881217f7e9dae950ae3b7179c07747924d67f4ba362f51ce"} Mar 21 04:30:57 crc kubenswrapper[4839]: I0321 04:30:57.124146 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4dbf18e5e98e81f4493f6c492ee383c648403a00f68be58482999bfc91dbb223"} Mar 21 04:30:57 crc kubenswrapper[4839]: I0321 04:30:57.124305 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:31:00 crc 
kubenswrapper[4839]: I0321 04:31:00.980270 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:31:00 crc kubenswrapper[4839]: I0321 04:31:00.980527 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.258491 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nw7r6"] Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.258778 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nw7r6" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="registry-server" containerID="cri-o://3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4" gracePeriod=30 Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.269856 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxrc8"] Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.270117 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mxrc8" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="registry-server" containerID="cri-o://f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4" gracePeriod=30 Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.276767 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-8jgh7"] Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.276965 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" podUID="6240548e-b827-4fdb-b2be-c7187d6a28e8" containerName="marketplace-operator" containerID="cri-o://10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9" gracePeriod=30 Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.288216 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qjgq"] Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.288443 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9qjgq" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="registry-server" containerID="cri-o://afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a" gracePeriod=30 Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.296250 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qb9bp"] Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.297034 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.309300 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zgfcm"] Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.309613 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zgfcm" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="registry-server" containerID="cri-o://e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817" gracePeriod=30 Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.325736 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qb9bp"] Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.490518 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df9bf95b-dc8f-4104-9c6c-873159393850-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.490930 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/df9bf95b-dc8f-4104-9c6c-873159393850-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.490979 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-628d5\" (UniqueName: 
\"kubernetes.io/projected/df9bf95b-dc8f-4104-9c6c-873159393850-kube-api-access-628d5\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.591868 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-628d5\" (UniqueName: \"kubernetes.io/projected/df9bf95b-dc8f-4104-9c6c-873159393850-kube-api-access-628d5\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.591942 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df9bf95b-dc8f-4104-9c6c-873159393850-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.591995 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/df9bf95b-dc8f-4104-9c6c-873159393850-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.593496 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df9bf95b-dc8f-4104-9c6c-873159393850-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc 
kubenswrapper[4839]: I0321 04:31:01.597895 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/df9bf95b-dc8f-4104-9c6c-873159393850-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.611117 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-628d5\" (UniqueName: \"kubernetes.io/projected/df9bf95b-dc8f-4104-9c6c-873159393850-kube-api-access-628d5\") pod \"marketplace-operator-79b997595-qb9bp\" (UID: \"df9bf95b-dc8f-4104-9c6c-873159393850\") " pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.617613 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.801353 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.954612 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.959142 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.965508 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:31:01 crc kubenswrapper[4839]: I0321 04:31:01.970909 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:01.998393 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krww4\" (UniqueName: \"kubernetes.io/projected/65a571df-f531-458b-9aed-6de99e4607e1-kube-api-access-krww4\") pod \"65a571df-f531-458b-9aed-6de99e4607e1\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:01.998478 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-catalog-content\") pod \"65a571df-f531-458b-9aed-6de99e4607e1\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:01.998583 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-utilities\") pod \"65a571df-f531-458b-9aed-6de99e4607e1\" (UID: \"65a571df-f531-458b-9aed-6de99e4607e1\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.002630 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-utilities" (OuterVolumeSpecName: "utilities") pod "65a571df-f531-458b-9aed-6de99e4607e1" (UID: "65a571df-f531-458b-9aed-6de99e4607e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.003734 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a571df-f531-458b-9aed-6de99e4607e1-kube-api-access-krww4" (OuterVolumeSpecName: "kube-api-access-krww4") pod "65a571df-f531-458b-9aed-6de99e4607e1" (UID: "65a571df-f531-458b-9aed-6de99e4607e1"). InnerVolumeSpecName "kube-api-access-krww4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.088753 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65a571df-f531-458b-9aed-6de99e4607e1" (UID: "65a571df-f531-458b-9aed-6de99e4607e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099366 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jctj4\" (UniqueName: \"kubernetes.io/projected/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-kube-api-access-jctj4\") pod \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099430 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-catalog-content\") pod \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099461 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-catalog-content\") pod \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099484 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-utilities\") pod \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\" (UID: \"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099515 4839 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v69rn\" (UniqueName: \"kubernetes.io/projected/6240548e-b827-4fdb-b2be-c7187d6a28e8-kube-api-access-v69rn\") pod \"6240548e-b827-4fdb-b2be-c7187d6a28e8\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099597 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-operator-metrics\") pod \"6240548e-b827-4fdb-b2be-c7187d6a28e8\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099643 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-catalog-content\") pod \"6513c45b-dd98-40b0-b69c-94db4d1c916e\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099668 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt4t7\" (UniqueName: \"kubernetes.io/projected/6513c45b-dd98-40b0-b69c-94db4d1c916e-kube-api-access-nt4t7\") pod \"6513c45b-dd98-40b0-b69c-94db4d1c916e\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099696 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-trusted-ca\") pod \"6240548e-b827-4fdb-b2be-c7187d6a28e8\" (UID: \"6240548e-b827-4fdb-b2be-c7187d6a28e8\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099717 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-utilities\") pod \"6513c45b-dd98-40b0-b69c-94db4d1c916e\" (UID: \"6513c45b-dd98-40b0-b69c-94db4d1c916e\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099754 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncnmc\" (UniqueName: \"kubernetes.io/projected/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-kube-api-access-ncnmc\") pod \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.099783 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-utilities\") pod \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\" (UID: \"0b7a7313-21c4-4909-9ebe-ebe552b29b8c\") " Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.100040 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.100067 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krww4\" (UniqueName: \"kubernetes.io/projected/65a571df-f531-458b-9aed-6de99e4607e1-kube-api-access-krww4\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.100081 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a571df-f531-458b-9aed-6de99e4607e1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.100761 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-utilities" (OuterVolumeSpecName: "utilities") pod "0b7a7313-21c4-4909-9ebe-ebe552b29b8c" (UID: 
"0b7a7313-21c4-4909-9ebe-ebe552b29b8c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.102623 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-utilities" (OuterVolumeSpecName: "utilities") pod "5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" (UID: "5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.103295 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6240548e-b827-4fdb-b2be-c7187d6a28e8" (UID: "6240548e-b827-4fdb-b2be-c7187d6a28e8"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.103821 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-utilities" (OuterVolumeSpecName: "utilities") pod "6513c45b-dd98-40b0-b69c-94db4d1c916e" (UID: "6513c45b-dd98-40b0-b69c-94db4d1c916e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.105225 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6513c45b-dd98-40b0-b69c-94db4d1c916e-kube-api-access-nt4t7" (OuterVolumeSpecName: "kube-api-access-nt4t7") pod "6513c45b-dd98-40b0-b69c-94db4d1c916e" (UID: "6513c45b-dd98-40b0-b69c-94db4d1c916e"). InnerVolumeSpecName "kube-api-access-nt4t7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.105507 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-kube-api-access-ncnmc" (OuterVolumeSpecName: "kube-api-access-ncnmc") pod "0b7a7313-21c4-4909-9ebe-ebe552b29b8c" (UID: "0b7a7313-21c4-4909-9ebe-ebe552b29b8c"). InnerVolumeSpecName "kube-api-access-ncnmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.105878 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-kube-api-access-jctj4" (OuterVolumeSpecName: "kube-api-access-jctj4") pod "5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" (UID: "5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c"). InnerVolumeSpecName "kube-api-access-jctj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.106137 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6240548e-b827-4fdb-b2be-c7187d6a28e8" (UID: "6240548e-b827-4fdb-b2be-c7187d6a28e8"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.106904 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6240548e-b827-4fdb-b2be-c7187d6a28e8-kube-api-access-v69rn" (OuterVolumeSpecName: "kube-api-access-v69rn") pod "6240548e-b827-4fdb-b2be-c7187d6a28e8" (UID: "6240548e-b827-4fdb-b2be-c7187d6a28e8"). InnerVolumeSpecName "kube-api-access-v69rn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.130552 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b7a7313-21c4-4909-9ebe-ebe552b29b8c" (UID: "0b7a7313-21c4-4909-9ebe-ebe552b29b8c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.154970 4839 generic.go:334] "Generic (PLEG): container finished" podID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerID="f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4" exitCode=0 Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.155041 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrc8" event={"ID":"6513c45b-dd98-40b0-b69c-94db4d1c916e","Type":"ContainerDied","Data":"f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.155053 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxrc8" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.155067 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrc8" event={"ID":"6513c45b-dd98-40b0-b69c-94db4d1c916e","Type":"ContainerDied","Data":"3c535ea31a5aa838095ae16f33b0780c50c3c3698c73e47ae8e3f30c17a3ac39"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.155085 4839 scope.go:117] "RemoveContainer" containerID="f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.157646 4839 generic.go:334] "Generic (PLEG): container finished" podID="65a571df-f531-458b-9aed-6de99e4607e1" containerID="3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4" exitCode=0 Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.157699 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw7r6" event={"ID":"65a571df-f531-458b-9aed-6de99e4607e1","Type":"ContainerDied","Data":"3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.157722 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw7r6" event={"ID":"65a571df-f531-458b-9aed-6de99e4607e1","Type":"ContainerDied","Data":"b5f435157e1b2e816a83545c0d59dbf17d3143a0eb363bf4e4b546731c0c8b35"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.157773 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nw7r6" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.170362 4839 generic.go:334] "Generic (PLEG): container finished" podID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerID="e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817" exitCode=0 Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.170427 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zgfcm" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.170434 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerDied","Data":"e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.170469 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgfcm" event={"ID":"5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c","Type":"ContainerDied","Data":"45ca59ee6d68e70db13f642a35e227f7dae46d5a40341a7fcc4d0c33d12ae8bf"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.171457 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qb9bp"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.173659 4839 generic.go:334] "Generic (PLEG): container finished" podID="6240548e-b827-4fdb-b2be-c7187d6a28e8" containerID="10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9" exitCode=0 Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.173702 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" event={"ID":"6240548e-b827-4fdb-b2be-c7187d6a28e8","Type":"ContainerDied","Data":"10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 
04:31:02.173720 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" event={"ID":"6240548e-b827-4fdb-b2be-c7187d6a28e8","Type":"ContainerDied","Data":"dec41352b22dc4b1f265aecf13bbf9f995403b64a2bfc4f44c88616523722931"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.173773 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8jgh7" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.180143 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6513c45b-dd98-40b0-b69c-94db4d1c916e" (UID: "6513c45b-dd98-40b0-b69c-94db4d1c916e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.183027 4839 scope.go:117] "RemoveContainer" containerID="8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205766 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v69rn\" (UniqueName: \"kubernetes.io/projected/6240548e-b827-4fdb-b2be-c7187d6a28e8-kube-api-access-v69rn\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205838 4839 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205857 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc 
kubenswrapper[4839]: I0321 04:31:02.205871 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt4t7\" (UniqueName: \"kubernetes.io/projected/6513c45b-dd98-40b0-b69c-94db4d1c916e-kube-api-access-nt4t7\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205890 4839 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6240548e-b827-4fdb-b2be-c7187d6a28e8-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205929 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6513c45b-dd98-40b0-b69c-94db4d1c916e-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205942 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncnmc\" (UniqueName: \"kubernetes.io/projected/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-kube-api-access-ncnmc\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205960 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.205972 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jctj4\" (UniqueName: \"kubernetes.io/projected/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-kube-api-access-jctj4\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.206017 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7a7313-21c4-4909-9ebe-ebe552b29b8c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.206035 4839 reconciler_common.go:293] "Volume 
detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.211114 4839 generic.go:334] "Generic (PLEG): container finished" podID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerID="afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a" exitCode=0 Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.211187 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qjgq" event={"ID":"0b7a7313-21c4-4909-9ebe-ebe552b29b8c","Type":"ContainerDied","Data":"afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.211220 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9qjgq" event={"ID":"0b7a7313-21c4-4909-9ebe-ebe552b29b8c","Type":"ContainerDied","Data":"6a5663fd0eb16a90e793ba0b93994b3affe90036f9e0e38ea8915b0da62b0425"} Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.211528 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9qjgq" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.221188 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nw7r6"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.225411 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nw7r6"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.229398 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8jgh7"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.233277 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8jgh7"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.239024 4839 scope.go:117] "RemoveContainer" containerID="ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.245414 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qjgq"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.251709 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9qjgq"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.259113 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" (UID: "5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.276294 4839 scope.go:117] "RemoveContainer" containerID="f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.276971 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4\": container with ID starting with f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4 not found: ID does not exist" containerID="f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.277008 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4"} err="failed to get container status \"f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4\": rpc error: code = NotFound desc = could not find container \"f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4\": container with ID starting with f8a0b041aeabd67569d74e5792234c2d8b9df0f165a69765fe545ae2e64554e4 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.277048 4839 scope.go:117] "RemoveContainer" containerID="8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.277270 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730\": container with ID starting with 8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730 not found: ID does not exist" containerID="8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.277312 
4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730"} err="failed to get container status \"8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730\": rpc error: code = NotFound desc = could not find container \"8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730\": container with ID starting with 8742ce04852792a9b09d21f9082b4080eead694d53c96e4113cd80bb39500730 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.277327 4839 scope.go:117] "RemoveContainer" containerID="ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.277536 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127\": container with ID starting with ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127 not found: ID does not exist" containerID="ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.277557 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127"} err="failed to get container status \"ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127\": rpc error: code = NotFound desc = could not find container \"ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127\": container with ID starting with ef36d2755a7903c4bb91306b919a88606f1cdd5839d1cc954ceec3191fcac127 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.277584 4839 scope.go:117] "RemoveContainer" containerID="3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 
04:31:02.295981 4839 scope.go:117] "RemoveContainer" containerID="efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.307968 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.320844 4839 scope.go:117] "RemoveContainer" containerID="b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.338186 4839 scope.go:117] "RemoveContainer" containerID="3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.338574 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4\": container with ID starting with 3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4 not found: ID does not exist" containerID="3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.338614 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4"} err="failed to get container status \"3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4\": rpc error: code = NotFound desc = could not find container \"3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4\": container with ID starting with 3e8fc037dfcad47120a88d3f8d35bf421fba0e269bb8bf6bdf7fceddbf556bd4 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.338640 4839 scope.go:117] "RemoveContainer" containerID="efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb" 
Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.339010 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb\": container with ID starting with efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb not found: ID does not exist" containerID="efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.339046 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb"} err="failed to get container status \"efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb\": rpc error: code = NotFound desc = could not find container \"efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb\": container with ID starting with efc4670847f72c6bb6856a32dc6e4a2a1da439face51ab3c5137f5668c3f5ffb not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.339067 4839 scope.go:117] "RemoveContainer" containerID="b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.339290 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6\": container with ID starting with b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6 not found: ID does not exist" containerID="b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.339304 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6"} err="failed to get container status 
\"b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6\": rpc error: code = NotFound desc = could not find container \"b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6\": container with ID starting with b1d917a6edaeb5ac2761ae508c1b9b4542abdb7660fa75528b425c220749f0c6 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.339315 4839 scope.go:117] "RemoveContainer" containerID="e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.349923 4839 scope.go:117] "RemoveContainer" containerID="7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.362778 4839 scope.go:117] "RemoveContainer" containerID="976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.376331 4839 scope.go:117] "RemoveContainer" containerID="e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.381745 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817\": container with ID starting with e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817 not found: ID does not exist" containerID="e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.381793 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817"} err="failed to get container status \"e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817\": rpc error: code = NotFound desc = could not find container \"e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817\": container with ID starting 
with e513246e76bc96d307d94fb57365aba010eb2b18b93b54a3b78c0eba14929817 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.381821 4839 scope.go:117] "RemoveContainer" containerID="7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.382883 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680\": container with ID starting with 7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680 not found: ID does not exist" containerID="7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.382934 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680"} err="failed to get container status \"7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680\": rpc error: code = NotFound desc = could not find container \"7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680\": container with ID starting with 7bce8f9c3683489f31e4a7ec0257da3a6aef6ea02766a2df62cfae9928d74680 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.382968 4839 scope.go:117] "RemoveContainer" containerID="976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.383414 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9\": container with ID starting with 976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9 not found: ID does not exist" containerID="976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9" Mar 21 04:31:02 
crc kubenswrapper[4839]: I0321 04:31:02.383433 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9"} err="failed to get container status \"976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9\": rpc error: code = NotFound desc = could not find container \"976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9\": container with ID starting with 976423ef41aaae33850c00fcd3501af4196d38bd3ce2a5845a341191caf443f9 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.383447 4839 scope.go:117] "RemoveContainer" containerID="10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.395559 4839 scope.go:117] "RemoveContainer" containerID="10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.395971 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9\": container with ID starting with 10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9 not found: ID does not exist" containerID="10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.396006 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9"} err="failed to get container status \"10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9\": rpc error: code = NotFound desc = could not find container \"10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9\": container with ID starting with 10d054bdba792021ed12279ec4d8798c4bb962afe280d6356a1f51072ca850c9 not found: ID does not exist" Mar 21 
04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.396063 4839 scope.go:117] "RemoveContainer" containerID="afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.407864 4839 scope.go:117] "RemoveContainer" containerID="32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.455809 4839 scope.go:117] "RemoveContainer" containerID="1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.460508 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" path="/var/lib/kubelet/pods/0b7a7313-21c4-4909-9ebe-ebe552b29b8c/volumes" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.461311 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6240548e-b827-4fdb-b2be-c7187d6a28e8" path="/var/lib/kubelet/pods/6240548e-b827-4fdb-b2be-c7187d6a28e8/volumes" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.461875 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a571df-f531-458b-9aed-6de99e4607e1" path="/var/lib/kubelet/pods/65a571df-f531-458b-9aed-6de99e4607e1/volumes" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.477736 4839 scope.go:117] "RemoveContainer" containerID="afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.478168 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a\": container with ID starting with afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a not found: ID does not exist" containerID="afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.478202 4839 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a"} err="failed to get container status \"afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a\": rpc error: code = NotFound desc = could not find container \"afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a\": container with ID starting with afcbe40293e4b6e4d4a36e26fb65a125e0864be6010a0ab4223765b126370c8a not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.478228 4839 scope.go:117] "RemoveContainer" containerID="32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 04:31:02.478550 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e\": container with ID starting with 32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e not found: ID does not exist" containerID="32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.478585 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e"} err="failed to get container status \"32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e\": rpc error: code = NotFound desc = could not find container \"32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e\": container with ID starting with 32838aa37e22f65845d98772270e4788d8051a89bcd948b68704534b2f4fcf1e not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.478599 4839 scope.go:117] "RemoveContainer" containerID="1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231" Mar 21 04:31:02 crc kubenswrapper[4839]: E0321 
04:31:02.478797 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231\": container with ID starting with 1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231 not found: ID does not exist" containerID="1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.478817 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231"} err="failed to get container status \"1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231\": rpc error: code = NotFound desc = could not find container \"1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231\": container with ID starting with 1f40b3f629e9c306e4ab3dffa3f2ab389b605eb7d1dc90763bed0504eb05b231 not found: ID does not exist" Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.500401 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxrc8"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.505581 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mxrc8"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.516755 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zgfcm"] Mar 21 04:31:02 crc kubenswrapper[4839]: I0321 04:31:02.520954 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zgfcm"] Mar 21 04:31:03 crc kubenswrapper[4839]: I0321 04:31:03.220086 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" 
event={"ID":"df9bf95b-dc8f-4104-9c6c-873159393850","Type":"ContainerStarted","Data":"a61fd4afaf9ad306000624583773ad7dfede05b26f16296be0baf8474b7fb7ab"} Mar 21 04:31:03 crc kubenswrapper[4839]: I0321 04:31:03.220143 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" event={"ID":"df9bf95b-dc8f-4104-9c6c-873159393850","Type":"ContainerStarted","Data":"58549bd273c0de0af6511133c6dd48f3904f237e11edf620a85c316cab17625e"} Mar 21 04:31:03 crc kubenswrapper[4839]: I0321 04:31:03.220322 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:03 crc kubenswrapper[4839]: I0321 04:31:03.225758 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" Mar 21 04:31:03 crc kubenswrapper[4839]: I0321 04:31:03.245393 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qb9bp" podStartSLOduration=2.245375078 podStartE2EDuration="2.245375078s" podCreationTimestamp="2026-03-21 04:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:31:03.23853028 +0000 UTC m=+467.566316986" watchObservedRunningTime="2026-03-21 04:31:03.245375078 +0000 UTC m=+467.573161764" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.307673 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7sxqv"] Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308222 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="extract-content" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308235 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="extract-content" Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308256 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="registry-server" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308263 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="registry-server" Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308274 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6240548e-b827-4fdb-b2be-c7187d6a28e8" containerName="marketplace-operator" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308281 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6240548e-b827-4fdb-b2be-c7187d6a28e8" containerName="marketplace-operator" Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308290 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="registry-server" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308296 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="registry-server" Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308308 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="extract-utilities" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308316 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="extract-utilities" Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308329 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="extract-utilities" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308336 4839 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="extract-utilities" Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308347 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="extract-content" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308355 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="extract-content" Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308365 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="registry-server" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308370 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="registry-server" Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308378 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="registry-server" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308384 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="registry-server" Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308392 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="extract-utilities" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308398 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="extract-utilities" Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308406 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="extract-content" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308411 4839 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="extract-content" Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308420 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="extract-content" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308425 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="extract-content" Mar 21 04:31:04 crc kubenswrapper[4839]: E0321 04:31:04.308433 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="extract-utilities" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308447 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="extract-utilities" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308529 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a571df-f531-458b-9aed-6de99e4607e1" containerName="registry-server" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308542 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" containerName="registry-server" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308550 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6240548e-b827-4fdb-b2be-c7187d6a28e8" containerName="marketplace-operator" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308557 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" containerName="registry-server" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.308590 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b7a7313-21c4-4909-9ebe-ebe552b29b8c" containerName="registry-server" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.310204 4839 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sxqv" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.316002 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sxqv"] Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.318260 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.330746 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vltzp\" (UniqueName: \"kubernetes.io/projected/e28c0850-90f8-445b-be34-13ab0d940eb4-kube-api-access-vltzp\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.330807 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28c0850-90f8-445b-be34-13ab0d940eb4-catalog-content\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.330844 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28c0850-90f8-445b-be34-13ab0d940eb4-utilities\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.432525 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vltzp\" (UniqueName: \"kubernetes.io/projected/e28c0850-90f8-445b-be34-13ab0d940eb4-kube-api-access-vltzp\") pod 
\"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.432606 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28c0850-90f8-445b-be34-13ab0d940eb4-catalog-content\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.432653 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28c0850-90f8-445b-be34-13ab0d940eb4-utilities\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.433129 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28c0850-90f8-445b-be34-13ab0d940eb4-utilities\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.433526 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28c0850-90f8-445b-be34-13ab0d940eb4-catalog-content\") pod \"redhat-marketplace-7sxqv\" (UID: \"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.459061 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vltzp\" (UniqueName: \"kubernetes.io/projected/e28c0850-90f8-445b-be34-13ab0d940eb4-kube-api-access-vltzp\") pod \"redhat-marketplace-7sxqv\" (UID: 
\"e28c0850-90f8-445b-be34-13ab0d940eb4\") " pod="openshift-marketplace/redhat-marketplace-7sxqv" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.464713 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c" path="/var/lib/kubelet/pods/5d1d0c02-87bf-4c8b-bc1c-d25007fb3c1c/volumes" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.466438 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6513c45b-dd98-40b0-b69c-94db4d1c916e" path="/var/lib/kubelet/pods/6513c45b-dd98-40b0-b69c-94db4d1c916e/volumes" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.504046 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8p22k"] Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.505036 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.510044 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.514959 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8p22k"] Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.634343 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sxqv" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.635439 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4chfq\" (UniqueName: \"kubernetes.io/projected/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-kube-api-access-4chfq\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.635543 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-utilities\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.635594 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-catalog-content\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.738668 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-utilities\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.739106 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-utilities\") pod \"redhat-operators-8p22k\" (UID: 
\"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.739109 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-catalog-content\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.739183 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4chfq\" (UniqueName: \"kubernetes.io/projected/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-kube-api-access-4chfq\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.739748 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-catalog-content\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.758394 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4chfq\" (UniqueName: \"kubernetes.io/projected/d2de7c7a-fc46-44bc-9fad-d346e82f8ebc-kube-api-access-4chfq\") pod \"redhat-operators-8p22k\" (UID: \"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc\") " pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:04 crc kubenswrapper[4839]: I0321 04:31:04.846879 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:05 crc kubenswrapper[4839]: I0321 04:31:05.043096 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sxqv"] Mar 21 04:31:05 crc kubenswrapper[4839]: W0321 04:31:05.050986 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode28c0850_90f8_445b_be34_13ab0d940eb4.slice/crio-7df77261ae0a5009d090b007920101a824ad5b5e9d1bf8636cce9c00538f7cd9 WatchSource:0}: Error finding container 7df77261ae0a5009d090b007920101a824ad5b5e9d1bf8636cce9c00538f7cd9: Status 404 returned error can't find the container with id 7df77261ae0a5009d090b007920101a824ad5b5e9d1bf8636cce9c00538f7cd9 Mar 21 04:31:05 crc kubenswrapper[4839]: I0321 04:31:05.249721 4839 generic.go:334] "Generic (PLEG): container finished" podID="e28c0850-90f8-445b-be34-13ab0d940eb4" containerID="671e02cd5c07dca793f66a536c30face9575efef661dd6ea8f93ced61743edb3" exitCode=0 Mar 21 04:31:05 crc kubenswrapper[4839]: I0321 04:31:05.249942 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sxqv" event={"ID":"e28c0850-90f8-445b-be34-13ab0d940eb4","Type":"ContainerDied","Data":"671e02cd5c07dca793f66a536c30face9575efef661dd6ea8f93ced61743edb3"} Mar 21 04:31:05 crc kubenswrapper[4839]: I0321 04:31:05.250091 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sxqv" event={"ID":"e28c0850-90f8-445b-be34-13ab0d940eb4","Type":"ContainerStarted","Data":"7df77261ae0a5009d090b007920101a824ad5b5e9d1bf8636cce9c00538f7cd9"} Mar 21 04:31:05 crc kubenswrapper[4839]: I0321 04:31:05.875863 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8p22k"] Mar 21 04:31:05 crc kubenswrapper[4839]: W0321 04:31:05.881413 4839 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2de7c7a_fc46_44bc_9fad_d346e82f8ebc.slice/crio-c79733c0a493e6a45d45cf07c4d3921f3a9afa3de87779dbb72ce95b7d3d98ae WatchSource:0}: Error finding container c79733c0a493e6a45d45cf07c4d3921f3a9afa3de87779dbb72ce95b7d3d98ae: Status 404 returned error can't find the container with id c79733c0a493e6a45d45cf07c4d3921f3a9afa3de87779dbb72ce95b7d3d98ae Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.255243 4839 generic.go:334] "Generic (PLEG): container finished" podID="d2de7c7a-fc46-44bc-9fad-d346e82f8ebc" containerID="7ac3f706c4d746984f3de670c6eb113b6380888615edcc9dab4dfbd09d139f6d" exitCode=0 Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.255350 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p22k" event={"ID":"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc","Type":"ContainerDied","Data":"7ac3f706c4d746984f3de670c6eb113b6380888615edcc9dab4dfbd09d139f6d"} Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.255657 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p22k" event={"ID":"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc","Type":"ContainerStarted","Data":"c79733c0a493e6a45d45cf07c4d3921f3a9afa3de87779dbb72ce95b7d3d98ae"} Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.257436 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sxqv" event={"ID":"e28c0850-90f8-445b-be34-13ab0d940eb4","Type":"ContainerStarted","Data":"c4696895008d77bf85dc49e81c28be91989a3ba193851694f6ef9a99a11c30d0"} Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.704411 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-99hx2"] Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.712698 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-99hx2" Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.718266 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.724668 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-99hx2"] Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.766364 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpg48\" (UniqueName: \"kubernetes.io/projected/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-kube-api-access-lpg48\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2" Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.766448 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-utilities\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2" Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.766500 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-catalog-content\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2" Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.867266 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpg48\" (UniqueName: \"kubernetes.io/projected/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-kube-api-access-lpg48\") pod \"community-operators-99hx2\" 
(UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2" Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.867327 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-utilities\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2" Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.867366 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-catalog-content\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2" Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.867788 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-utilities\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2" Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.867818 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-catalog-content\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2" Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.902624 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xg8xw"] Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.903547 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xg8xw" Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.911181 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.915656 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpg48\" (UniqueName: \"kubernetes.io/projected/51f96bb3-505b-4c7b-bc6d-b0a465c7daae-kube-api-access-lpg48\") pod \"community-operators-99hx2\" (UID: \"51f96bb3-505b-4c7b-bc6d-b0a465c7daae\") " pod="openshift-marketplace/community-operators-99hx2" Mar 21 04:31:06 crc kubenswrapper[4839]: I0321 04:31:06.915765 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xg8xw"] Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.028967 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-99hx2" Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.069498 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-catalog-content\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw" Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.069876 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gc48\" (UniqueName: \"kubernetes.io/projected/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-kube-api-access-4gc48\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw" Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.069921 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-utilities\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw" Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.170642 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-catalog-content\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw" Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.170694 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gc48\" (UniqueName: \"kubernetes.io/projected/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-kube-api-access-4gc48\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw" Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.170733 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-utilities\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw" Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.171173 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-utilities\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw" Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.171279 4839 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-catalog-content\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw" Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.188274 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gc48\" (UniqueName: \"kubernetes.io/projected/1d4943ad-c109-47a0-bcc8-4eb1a89836ca-kube-api-access-4gc48\") pod \"certified-operators-xg8xw\" (UID: \"1d4943ad-c109-47a0-bcc8-4eb1a89836ca\") " pod="openshift-marketplace/certified-operators-xg8xw" Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.225881 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xg8xw" Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.276697 4839 generic.go:334] "Generic (PLEG): container finished" podID="e28c0850-90f8-445b-be34-13ab0d940eb4" containerID="c4696895008d77bf85dc49e81c28be91989a3ba193851694f6ef9a99a11c30d0" exitCode=0 Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.276917 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sxqv" event={"ID":"e28c0850-90f8-445b-be34-13ab0d940eb4","Type":"ContainerDied","Data":"c4696895008d77bf85dc49e81c28be91989a3ba193851694f6ef9a99a11c30d0"} Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.478738 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-99hx2"] Mar 21 04:31:07 crc kubenswrapper[4839]: W0321 04:31:07.489036 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51f96bb3_505b_4c7b_bc6d_b0a465c7daae.slice/crio-5dfbf9a67af37390466b9ccf173193b1784ffd5fc8fd6f8770a7a80bfe124608 WatchSource:0}: Error finding container 
5dfbf9a67af37390466b9ccf173193b1784ffd5fc8fd6f8770a7a80bfe124608: Status 404 returned error can't find the container with id 5dfbf9a67af37390466b9ccf173193b1784ffd5fc8fd6f8770a7a80bfe124608 Mar 21 04:31:07 crc kubenswrapper[4839]: I0321 04:31:07.623923 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xg8xw"] Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.284663 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sxqv" event={"ID":"e28c0850-90f8-445b-be34-13ab0d940eb4","Type":"ContainerStarted","Data":"ad183f3dc059b05c56ba2d142f216f479d5cd17174ed23542746efc702343255"} Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.288422 4839 generic.go:334] "Generic (PLEG): container finished" podID="d2de7c7a-fc46-44bc-9fad-d346e82f8ebc" containerID="78d7e4f0f24f3717e0672a2931c7a29a770677751b0f588204cc8056a3eacae1" exitCode=0 Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.288471 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p22k" event={"ID":"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc","Type":"ContainerDied","Data":"78d7e4f0f24f3717e0672a2931c7a29a770677751b0f588204cc8056a3eacae1"} Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.291594 4839 generic.go:334] "Generic (PLEG): container finished" podID="1d4943ad-c109-47a0-bcc8-4eb1a89836ca" containerID="8dc14d18d8cf81dc96e16afef9fbbd62f7059647583f5f33c915a3c943cd863d" exitCode=0 Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.292134 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xg8xw" event={"ID":"1d4943ad-c109-47a0-bcc8-4eb1a89836ca","Type":"ContainerDied","Data":"8dc14d18d8cf81dc96e16afef9fbbd62f7059647583f5f33c915a3c943cd863d"} Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.292186 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-xg8xw" event={"ID":"1d4943ad-c109-47a0-bcc8-4eb1a89836ca","Type":"ContainerStarted","Data":"8425816cb044e883f5be89447f594c78ef1d0a1b7dc84bbfc79a41aee4e2ccff"} Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.296454 4839 generic.go:334] "Generic (PLEG): container finished" podID="51f96bb3-505b-4c7b-bc6d-b0a465c7daae" containerID="cbac869c90a20289dbbcaf58c9bf7ffae0fa45f3c2c8a2348fee1c393d4640c5" exitCode=0 Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.296498 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99hx2" event={"ID":"51f96bb3-505b-4c7b-bc6d-b0a465c7daae","Type":"ContainerDied","Data":"cbac869c90a20289dbbcaf58c9bf7ffae0fa45f3c2c8a2348fee1c393d4640c5"} Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.296529 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99hx2" event={"ID":"51f96bb3-505b-4c7b-bc6d-b0a465c7daae","Type":"ContainerStarted","Data":"5dfbf9a67af37390466b9ccf173193b1784ffd5fc8fd6f8770a7a80bfe124608"} Mar 21 04:31:08 crc kubenswrapper[4839]: I0321 04:31:08.312413 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7sxqv" podStartSLOduration=1.802340448 podStartE2EDuration="4.312395824s" podCreationTimestamp="2026-03-21 04:31:04 +0000 UTC" firstStartedPulling="2026-03-21 04:31:05.255042666 +0000 UTC m=+469.582829342" lastFinishedPulling="2026-03-21 04:31:07.765098042 +0000 UTC m=+472.092884718" observedRunningTime="2026-03-21 04:31:08.308792695 +0000 UTC m=+472.636579381" watchObservedRunningTime="2026-03-21 04:31:08.312395824 +0000 UTC m=+472.640182500" Mar 21 04:31:09 crc kubenswrapper[4839]: I0321 04:31:09.306123 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p22k" 
event={"ID":"d2de7c7a-fc46-44bc-9fad-d346e82f8ebc","Type":"ContainerStarted","Data":"eaf8ffe3b2691785f746aa6a66a7e03fa72af7beca0e7bf2a2ca95ff339769f2"} Mar 21 04:31:09 crc kubenswrapper[4839]: I0321 04:31:09.307930 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xg8xw" event={"ID":"1d4943ad-c109-47a0-bcc8-4eb1a89836ca","Type":"ContainerStarted","Data":"ca17604b0621caac86216daea7679a5a305dc92873a2d65515d33fbecb8395bf"} Mar 21 04:31:09 crc kubenswrapper[4839]: I0321 04:31:09.329257 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8p22k" podStartSLOduration=2.9310292860000002 podStartE2EDuration="5.329241078s" podCreationTimestamp="2026-03-21 04:31:04 +0000 UTC" firstStartedPulling="2026-03-21 04:31:06.257961519 +0000 UTC m=+470.585748195" lastFinishedPulling="2026-03-21 04:31:08.656173301 +0000 UTC m=+472.983959987" observedRunningTime="2026-03-21 04:31:09.323968883 +0000 UTC m=+473.651755559" watchObservedRunningTime="2026-03-21 04:31:09.329241078 +0000 UTC m=+473.657027754" Mar 21 04:31:10 crc kubenswrapper[4839]: I0321 04:31:10.314747 4839 generic.go:334] "Generic (PLEG): container finished" podID="1d4943ad-c109-47a0-bcc8-4eb1a89836ca" containerID="ca17604b0621caac86216daea7679a5a305dc92873a2d65515d33fbecb8395bf" exitCode=0 Mar 21 04:31:10 crc kubenswrapper[4839]: I0321 04:31:10.314848 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xg8xw" event={"ID":"1d4943ad-c109-47a0-bcc8-4eb1a89836ca","Type":"ContainerDied","Data":"ca17604b0621caac86216daea7679a5a305dc92873a2d65515d33fbecb8395bf"} Mar 21 04:31:10 crc kubenswrapper[4839]: I0321 04:31:10.318812 4839 generic.go:334] "Generic (PLEG): container finished" podID="51f96bb3-505b-4c7b-bc6d-b0a465c7daae" containerID="aa598d041afcc73aec2b39e045139334e7de1f9ed9f722e0af6d0c3eaf74de2b" exitCode=0 Mar 21 04:31:10 crc kubenswrapper[4839]: I0321 
04:31:10.319340 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99hx2" event={"ID":"51f96bb3-505b-4c7b-bc6d-b0a465c7daae","Type":"ContainerDied","Data":"aa598d041afcc73aec2b39e045139334e7de1f9ed9f722e0af6d0c3eaf74de2b"} Mar 21 04:31:11 crc kubenswrapper[4839]: I0321 04:31:11.022456 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-dq7r2" Mar 21 04:31:11 crc kubenswrapper[4839]: I0321 04:31:11.079698 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ql2ps"] Mar 21 04:31:11 crc kubenswrapper[4839]: I0321 04:31:11.325333 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-99hx2" event={"ID":"51f96bb3-505b-4c7b-bc6d-b0a465c7daae","Type":"ContainerStarted","Data":"54638761080df82c2fd1dbca89c408f24ffd27149bbba2519c39a4fa9f226ac7"} Mar 21 04:31:11 crc kubenswrapper[4839]: I0321 04:31:11.327392 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xg8xw" event={"ID":"1d4943ad-c109-47a0-bcc8-4eb1a89836ca","Type":"ContainerStarted","Data":"ceb673ff766e58c024b0d0da7fd37d67489e5e02914e27fa16da101592e95b29"} Mar 21 04:31:11 crc kubenswrapper[4839]: I0321 04:31:11.342960 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-99hx2" podStartSLOduration=2.7120386 podStartE2EDuration="5.342941647s" podCreationTimestamp="2026-03-21 04:31:06 +0000 UTC" firstStartedPulling="2026-03-21 04:31:08.298044651 +0000 UTC m=+472.625831327" lastFinishedPulling="2026-03-21 04:31:10.928947678 +0000 UTC m=+475.256734374" observedRunningTime="2026-03-21 04:31:11.34268919 +0000 UTC m=+475.670475866" watchObservedRunningTime="2026-03-21 04:31:11.342941647 +0000 UTC m=+475.670728323" Mar 21 04:31:11 crc kubenswrapper[4839]: I0321 04:31:11.362416 4839 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xg8xw" podStartSLOduration=2.767659224 podStartE2EDuration="5.36239948s" podCreationTimestamp="2026-03-21 04:31:06 +0000 UTC" firstStartedPulling="2026-03-21 04:31:08.294103983 +0000 UTC m=+472.621890659" lastFinishedPulling="2026-03-21 04:31:10.888844239 +0000 UTC m=+475.216630915" observedRunningTime="2026-03-21 04:31:11.361335271 +0000 UTC m=+475.689121947" watchObservedRunningTime="2026-03-21 04:31:11.36239948 +0000 UTC m=+475.690186156" Mar 21 04:31:14 crc kubenswrapper[4839]: I0321 04:31:14.635445 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7sxqv" Mar 21 04:31:14 crc kubenswrapper[4839]: I0321 04:31:14.635837 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7sxqv" Mar 21 04:31:14 crc kubenswrapper[4839]: I0321 04:31:14.674692 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7sxqv" Mar 21 04:31:14 crc kubenswrapper[4839]: I0321 04:31:14.847533 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:14 crc kubenswrapper[4839]: I0321 04:31:14.847779 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:15 crc kubenswrapper[4839]: I0321 04:31:15.412839 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7sxqv" Mar 21 04:31:15 crc kubenswrapper[4839]: I0321 04:31:15.895922 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8p22k" podUID="d2de7c7a-fc46-44bc-9fad-d346e82f8ebc" containerName="registry-server" probeResult="failure" output=< Mar 21 04:31:15 crc 
kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 04:31:15 crc kubenswrapper[4839]: > Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.029648 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-99hx2" Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.030625 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-99hx2" Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.065031 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-99hx2" Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.226620 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xg8xw" Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.226685 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xg8xw" Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.262641 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xg8xw" Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.393638 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xg8xw" Mar 21 04:31:17 crc kubenswrapper[4839]: I0321 04:31:17.394533 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-99hx2" Mar 21 04:31:24 crc kubenswrapper[4839]: I0321 04:31:24.895852 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:24 crc kubenswrapper[4839]: I0321 04:31:24.941086 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-8p22k" Mar 21 04:31:30 crc kubenswrapper[4839]: I0321 04:31:30.980486 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:31:30 crc kubenswrapper[4839]: I0321 04:31:30.980933 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:31:35 crc kubenswrapper[4839]: I0321 04:31:35.659350 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.122348 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" podUID="7ef3f28d-e496-434e-a803-3b9a0fa24690" containerName="registry" containerID="cri-o://0b216795d8b50fc395a781f55afc6bd2e9902da0332fa52d6ee539b16a4c0446" gracePeriod=30 Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.485409 4839 generic.go:334] "Generic (PLEG): container finished" podID="7ef3f28d-e496-434e-a803-3b9a0fa24690" containerID="0b216795d8b50fc395a781f55afc6bd2e9902da0332fa52d6ee539b16a4c0446" exitCode=0 Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.485508 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" event={"ID":"7ef3f28d-e496-434e-a803-3b9a0fa24690","Type":"ContainerDied","Data":"0b216795d8b50fc395a781f55afc6bd2e9902da0332fa52d6ee539b16a4c0446"} Mar 21 04:31:36 crc kubenswrapper[4839]: 
I0321 04:31:36.534757 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.660867 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-certificates\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.661145 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.661175 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfdvw\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-kube-api-access-pfdvw\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.661207 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7ef3f28d-e496-434e-a803-3b9a0fa24690-installation-pull-secrets\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.661229 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7ef3f28d-e496-434e-a803-3b9a0fa24690-ca-trust-extracted\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: 
\"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.661248 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-tls\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.662046 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.666698 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-bound-sa-token\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.666761 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-trusted-ca\") pod \"7ef3f28d-e496-434e-a803-3b9a0fa24690\" (UID: \"7ef3f28d-e496-434e-a803-3b9a0fa24690\") " Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.667199 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-kube-api-access-pfdvw" (OuterVolumeSpecName: "kube-api-access-pfdvw") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "kube-api-access-pfdvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.667241 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.667536 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.667863 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.667880 4839 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.667894 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfdvw\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-kube-api-access-pfdvw\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.667904 4839 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-registry-tls\") on node \"crc\" 
DevicePath \"\"" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.669293 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.674293 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.678040 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef3f28d-e496-434e-a803-3b9a0fa24690-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.678725 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef3f28d-e496-434e-a803-3b9a0fa24690-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7ef3f28d-e496-434e-a803-3b9a0fa24690" (UID: "7ef3f28d-e496-434e-a803-3b9a0fa24690"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.769862 4839 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7ef3f28d-e496-434e-a803-3b9a0fa24690-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.769901 4839 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7ef3f28d-e496-434e-a803-3b9a0fa24690-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:36 crc kubenswrapper[4839]: I0321 04:31:36.769943 4839 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7ef3f28d-e496-434e-a803-3b9a0fa24690-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 21 04:31:37 crc kubenswrapper[4839]: I0321 04:31:37.491876 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" event={"ID":"7ef3f28d-e496-434e-a803-3b9a0fa24690","Type":"ContainerDied","Data":"db289ed2561962adc1edb7c7cc7d0a2aafe884fed424734dbdd27242d856949f"} Mar 21 04:31:37 crc kubenswrapper[4839]: I0321 04:31:37.491942 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ql2ps" Mar 21 04:31:37 crc kubenswrapper[4839]: I0321 04:31:37.492192 4839 scope.go:117] "RemoveContainer" containerID="0b216795d8b50fc395a781f55afc6bd2e9902da0332fa52d6ee539b16a4c0446" Mar 21 04:31:37 crc kubenswrapper[4839]: I0321 04:31:37.526761 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ql2ps"] Mar 21 04:31:37 crc kubenswrapper[4839]: I0321 04:31:37.535262 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ql2ps"] Mar 21 04:31:38 crc kubenswrapper[4839]: I0321 04:31:38.459638 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef3f28d-e496-434e-a803-3b9a0fa24690" path="/var/lib/kubelet/pods/7ef3f28d-e496-434e-a803-3b9a0fa24690/volumes" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.141386 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567792-rhj6k"] Mar 21 04:32:00 crc kubenswrapper[4839]: E0321 04:32:00.142288 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef3f28d-e496-434e-a803-3b9a0fa24690" containerName="registry" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.142301 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef3f28d-e496-434e-a803-3b9a0fa24690" containerName="registry" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.142424 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef3f28d-e496-434e-a803-3b9a0fa24690" containerName="registry" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.142941 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567792-rhj6k" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.145935 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.146119 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.146279 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.149032 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567792-rhj6k"] Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.176444 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxgj4\" (UniqueName: \"kubernetes.io/projected/fbe4754f-40a1-43e0-827f-557507a5e7d1-kube-api-access-lxgj4\") pod \"auto-csr-approver-29567792-rhj6k\" (UID: \"fbe4754f-40a1-43e0-827f-557507a5e7d1\") " pod="openshift-infra/auto-csr-approver-29567792-rhj6k" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.278144 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxgj4\" (UniqueName: \"kubernetes.io/projected/fbe4754f-40a1-43e0-827f-557507a5e7d1-kube-api-access-lxgj4\") pod \"auto-csr-approver-29567792-rhj6k\" (UID: \"fbe4754f-40a1-43e0-827f-557507a5e7d1\") " pod="openshift-infra/auto-csr-approver-29567792-rhj6k" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.298675 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxgj4\" (UniqueName: \"kubernetes.io/projected/fbe4754f-40a1-43e0-827f-557507a5e7d1-kube-api-access-lxgj4\") pod \"auto-csr-approver-29567792-rhj6k\" (UID: \"fbe4754f-40a1-43e0-827f-557507a5e7d1\") " 
pod="openshift-infra/auto-csr-approver-29567792-rhj6k" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.496218 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567792-rhj6k" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.890529 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567792-rhj6k"] Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.980635 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.980897 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.980945 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.981477 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e09fc13ebec75e4a854ca3cecb49f40ab8a65cb0b655c2368ba9c14be11281c6"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:32:00 crc kubenswrapper[4839]: I0321 04:32:00.981534 4839 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://e09fc13ebec75e4a854ca3cecb49f40ab8a65cb0b655c2368ba9c14be11281c6" gracePeriod=600 Mar 21 04:32:01 crc kubenswrapper[4839]: I0321 04:32:01.635715 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="e09fc13ebec75e4a854ca3cecb49f40ab8a65cb0b655c2368ba9c14be11281c6" exitCode=0 Mar 21 04:32:01 crc kubenswrapper[4839]: I0321 04:32:01.635837 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"e09fc13ebec75e4a854ca3cecb49f40ab8a65cb0b655c2368ba9c14be11281c6"} Mar 21 04:32:01 crc kubenswrapper[4839]: I0321 04:32:01.636126 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"da4c2d3dcbc2429432cc1a9b7a706caf5c1cde12d0441535caf710ea73866018"} Mar 21 04:32:01 crc kubenswrapper[4839]: I0321 04:32:01.636156 4839 scope.go:117] "RemoveContainer" containerID="a124ddc90d8b93fe4b6f77b778fbdac127b4896f5f55e6d5d78629cd17584311" Mar 21 04:32:01 crc kubenswrapper[4839]: I0321 04:32:01.637887 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567792-rhj6k" event={"ID":"fbe4754f-40a1-43e0-827f-557507a5e7d1","Type":"ContainerStarted","Data":"df484df4adc4b9da9eb920fd0aca59833fdf5b6b0a24afdf6a74a12ba2c545d1"} Mar 21 04:32:02 crc kubenswrapper[4839]: I0321 04:32:02.649971 4839 generic.go:334] "Generic (PLEG): container finished" podID="fbe4754f-40a1-43e0-827f-557507a5e7d1" containerID="c106b5183e83a440589571433cb66f6749e926bbac60bb184fac0a05ac6cf93b" exitCode=0 Mar 21 04:32:02 crc kubenswrapper[4839]: I0321 
04:32:02.650503 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567792-rhj6k" event={"ID":"fbe4754f-40a1-43e0-827f-557507a5e7d1","Type":"ContainerDied","Data":"c106b5183e83a440589571433cb66f6749e926bbac60bb184fac0a05ac6cf93b"} Mar 21 04:32:03 crc kubenswrapper[4839]: I0321 04:32:03.920505 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567792-rhj6k" Mar 21 04:32:03 crc kubenswrapper[4839]: I0321 04:32:03.939547 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxgj4\" (UniqueName: \"kubernetes.io/projected/fbe4754f-40a1-43e0-827f-557507a5e7d1-kube-api-access-lxgj4\") pod \"fbe4754f-40a1-43e0-827f-557507a5e7d1\" (UID: \"fbe4754f-40a1-43e0-827f-557507a5e7d1\") " Mar 21 04:32:03 crc kubenswrapper[4839]: I0321 04:32:03.945994 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe4754f-40a1-43e0-827f-557507a5e7d1-kube-api-access-lxgj4" (OuterVolumeSpecName: "kube-api-access-lxgj4") pod "fbe4754f-40a1-43e0-827f-557507a5e7d1" (UID: "fbe4754f-40a1-43e0-827f-557507a5e7d1"). InnerVolumeSpecName "kube-api-access-lxgj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:32:04 crc kubenswrapper[4839]: I0321 04:32:04.040783 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxgj4\" (UniqueName: \"kubernetes.io/projected/fbe4754f-40a1-43e0-827f-557507a5e7d1-kube-api-access-lxgj4\") on node \"crc\" DevicePath \"\"" Mar 21 04:32:04 crc kubenswrapper[4839]: I0321 04:32:04.665845 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567792-rhj6k" event={"ID":"fbe4754f-40a1-43e0-827f-557507a5e7d1","Type":"ContainerDied","Data":"df484df4adc4b9da9eb920fd0aca59833fdf5b6b0a24afdf6a74a12ba2c545d1"} Mar 21 04:32:04 crc kubenswrapper[4839]: I0321 04:32:04.665895 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df484df4adc4b9da9eb920fd0aca59833fdf5b6b0a24afdf6a74a12ba2c545d1" Mar 21 04:32:04 crc kubenswrapper[4839]: I0321 04:32:04.665953 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567792-rhj6k" Mar 21 04:32:04 crc kubenswrapper[4839]: I0321 04:32:04.976764 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567786-d8w8k"] Mar 21 04:32:04 crc kubenswrapper[4839]: I0321 04:32:04.979680 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567786-d8w8k"] Mar 21 04:32:06 crc kubenswrapper[4839]: I0321 04:32:06.461733 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="609ace61-45d1-44f6-b378-fb97eecf2374" path="/var/lib/kubelet/pods/609ace61-45d1-44f6-b378-fb97eecf2374/volumes" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.149381 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567794-rclnt"] Mar 21 04:34:00 crc kubenswrapper[4839]: E0321 04:34:00.150440 4839 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fbe4754f-40a1-43e0-827f-557507a5e7d1" containerName="oc" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.150468 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe4754f-40a1-43e0-827f-557507a5e7d1" containerName="oc" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.150760 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe4754f-40a1-43e0-827f-557507a5e7d1" containerName="oc" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.151466 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567794-rclnt" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.155772 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.156224 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.156878 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567794-rclnt"] Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.158648 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.162965 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcjv9\" (UniqueName: \"kubernetes.io/projected/2dfa2356-3aca-4ed1-bfce-93cc8857825d-kube-api-access-zcjv9\") pod \"auto-csr-approver-29567794-rclnt\" (UID: \"2dfa2356-3aca-4ed1-bfce-93cc8857825d\") " pod="openshift-infra/auto-csr-approver-29567794-rclnt" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.264695 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcjv9\" (UniqueName: 
\"kubernetes.io/projected/2dfa2356-3aca-4ed1-bfce-93cc8857825d-kube-api-access-zcjv9\") pod \"auto-csr-approver-29567794-rclnt\" (UID: \"2dfa2356-3aca-4ed1-bfce-93cc8857825d\") " pod="openshift-infra/auto-csr-approver-29567794-rclnt" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.283753 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcjv9\" (UniqueName: \"kubernetes.io/projected/2dfa2356-3aca-4ed1-bfce-93cc8857825d-kube-api-access-zcjv9\") pod \"auto-csr-approver-29567794-rclnt\" (UID: \"2dfa2356-3aca-4ed1-bfce-93cc8857825d\") " pod="openshift-infra/auto-csr-approver-29567794-rclnt" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.483189 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567794-rclnt" Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.904615 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567794-rclnt"] Mar 21 04:34:00 crc kubenswrapper[4839]: I0321 04:34:00.918805 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 04:34:01 crc kubenswrapper[4839]: I0321 04:34:01.370691 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567794-rclnt" event={"ID":"2dfa2356-3aca-4ed1-bfce-93cc8857825d","Type":"ContainerStarted","Data":"adf709a089479709a5e89dbbee5413cabcfdbc11eac14be8289d76ab28d549e1"} Mar 21 04:34:02 crc kubenswrapper[4839]: I0321 04:34:02.378305 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567794-rclnt" event={"ID":"2dfa2356-3aca-4ed1-bfce-93cc8857825d","Type":"ContainerStarted","Data":"28332a1cde28bd0485ad3577e9e484f03df8597359b74118b7173ff71df9e89d"} Mar 21 04:34:02 crc kubenswrapper[4839]: I0321 04:34:02.390241 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29567794-rclnt" podStartSLOduration=1.4019920510000001 podStartE2EDuration="2.390223858s" podCreationTimestamp="2026-03-21 04:34:00 +0000 UTC" firstStartedPulling="2026-03-21 04:34:00.917240398 +0000 UTC m=+645.245027104" lastFinishedPulling="2026-03-21 04:34:01.905472205 +0000 UTC m=+646.233258911" observedRunningTime="2026-03-21 04:34:02.389614701 +0000 UTC m=+646.717401377" watchObservedRunningTime="2026-03-21 04:34:02.390223858 +0000 UTC m=+646.718010534" Mar 21 04:34:03 crc kubenswrapper[4839]: I0321 04:34:03.386453 4839 generic.go:334] "Generic (PLEG): container finished" podID="2dfa2356-3aca-4ed1-bfce-93cc8857825d" containerID="28332a1cde28bd0485ad3577e9e484f03df8597359b74118b7173ff71df9e89d" exitCode=0 Mar 21 04:34:03 crc kubenswrapper[4839]: I0321 04:34:03.386502 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567794-rclnt" event={"ID":"2dfa2356-3aca-4ed1-bfce-93cc8857825d","Type":"ContainerDied","Data":"28332a1cde28bd0485ad3577e9e484f03df8597359b74118b7173ff71df9e89d"} Mar 21 04:34:04 crc kubenswrapper[4839]: I0321 04:34:04.622781 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567794-rclnt" Mar 21 04:34:04 crc kubenswrapper[4839]: I0321 04:34:04.726082 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcjv9\" (UniqueName: \"kubernetes.io/projected/2dfa2356-3aca-4ed1-bfce-93cc8857825d-kube-api-access-zcjv9\") pod \"2dfa2356-3aca-4ed1-bfce-93cc8857825d\" (UID: \"2dfa2356-3aca-4ed1-bfce-93cc8857825d\") " Mar 21 04:34:04 crc kubenswrapper[4839]: I0321 04:34:04.732559 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dfa2356-3aca-4ed1-bfce-93cc8857825d-kube-api-access-zcjv9" (OuterVolumeSpecName: "kube-api-access-zcjv9") pod "2dfa2356-3aca-4ed1-bfce-93cc8857825d" (UID: "2dfa2356-3aca-4ed1-bfce-93cc8857825d"). InnerVolumeSpecName "kube-api-access-zcjv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:34:04 crc kubenswrapper[4839]: I0321 04:34:04.827311 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcjv9\" (UniqueName: \"kubernetes.io/projected/2dfa2356-3aca-4ed1-bfce-93cc8857825d-kube-api-access-zcjv9\") on node \"crc\" DevicePath \"\"" Mar 21 04:34:05 crc kubenswrapper[4839]: I0321 04:34:05.399564 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567794-rclnt" event={"ID":"2dfa2356-3aca-4ed1-bfce-93cc8857825d","Type":"ContainerDied","Data":"adf709a089479709a5e89dbbee5413cabcfdbc11eac14be8289d76ab28d549e1"} Mar 21 04:34:05 crc kubenswrapper[4839]: I0321 04:34:05.399948 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adf709a089479709a5e89dbbee5413cabcfdbc11eac14be8289d76ab28d549e1" Mar 21 04:34:05 crc kubenswrapper[4839]: I0321 04:34:05.399786 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567794-rclnt" Mar 21 04:34:05 crc kubenswrapper[4839]: I0321 04:34:05.445724 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567788-9snlp"] Mar 21 04:34:05 crc kubenswrapper[4839]: I0321 04:34:05.448700 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567788-9snlp"] Mar 21 04:34:06 crc kubenswrapper[4839]: I0321 04:34:06.467514 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a45deb0c-4247-4d23-86db-a897c7f7e7f2" path="/var/lib/kubelet/pods/a45deb0c-4247-4d23-86db-a897c7f7e7f2/volumes" Mar 21 04:34:29 crc kubenswrapper[4839]: I0321 04:34:29.860091 4839 scope.go:117] "RemoveContainer" containerID="4d013e774070ce075bd0baa030b45d638ec14fab41990f0c671aa0d311846927" Mar 21 04:34:29 crc kubenswrapper[4839]: I0321 04:34:29.901849 4839 scope.go:117] "RemoveContainer" containerID="de6f2a80d57a636d18226b6f51d6ae0c6746d29df097ca4fd364524695c212fc" Mar 21 04:34:30 crc kubenswrapper[4839]: I0321 04:34:30.981216 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:34:30 crc kubenswrapper[4839]: I0321 04:34:30.981280 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:35:00 crc kubenswrapper[4839]: I0321 04:35:00.980355 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:35:00 crc kubenswrapper[4839]: I0321 04:35:00.981225 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:35:30 crc kubenswrapper[4839]: I0321 04:35:30.980014 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:35:30 crc kubenswrapper[4839]: I0321 04:35:30.980598 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:35:30 crc kubenswrapper[4839]: I0321 04:35:30.980660 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:35:30 crc kubenswrapper[4839]: I0321 04:35:30.981451 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da4c2d3dcbc2429432cc1a9b7a706caf5c1cde12d0441535caf710ea73866018"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:35:30 crc kubenswrapper[4839]: I0321 04:35:30.981530 4839 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://da4c2d3dcbc2429432cc1a9b7a706caf5c1cde12d0441535caf710ea73866018" gracePeriod=600 Mar 21 04:35:31 crc kubenswrapper[4839]: I0321 04:35:31.884782 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="da4c2d3dcbc2429432cc1a9b7a706caf5c1cde12d0441535caf710ea73866018" exitCode=0 Mar 21 04:35:31 crc kubenswrapper[4839]: I0321 04:35:31.884871 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"da4c2d3dcbc2429432cc1a9b7a706caf5c1cde12d0441535caf710ea73866018"} Mar 21 04:35:31 crc kubenswrapper[4839]: I0321 04:35:31.885369 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"d9f640234dbdc5d617b2a0974e24e968076d94c55d65466d46d7d064392afc02"} Mar 21 04:35:31 crc kubenswrapper[4839]: I0321 04:35:31.885393 4839 scope.go:117] "RemoveContainer" containerID="e09fc13ebec75e4a854ca3cecb49f40ab8a65cb0b655c2368ba9c14be11281c6" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.131794 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567796-c5w5j"] Mar 21 04:36:00 crc kubenswrapper[4839]: E0321 04:36:00.132593 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dfa2356-3aca-4ed1-bfce-93cc8857825d" containerName="oc" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.132608 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dfa2356-3aca-4ed1-bfce-93cc8857825d" containerName="oc" Mar 21 04:36:00 crc 
kubenswrapper[4839]: I0321 04:36:00.132734 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dfa2356-3aca-4ed1-bfce-93cc8857825d" containerName="oc" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.133151 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567796-c5w5j" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.136388 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.136704 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.136954 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.151135 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567796-c5w5j"] Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.287547 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tcnw\" (UniqueName: \"kubernetes.io/projected/c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b-kube-api-access-8tcnw\") pod \"auto-csr-approver-29567796-c5w5j\" (UID: \"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b\") " pod="openshift-infra/auto-csr-approver-29567796-c5w5j" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.388977 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tcnw\" (UniqueName: \"kubernetes.io/projected/c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b-kube-api-access-8tcnw\") pod \"auto-csr-approver-29567796-c5w5j\" (UID: \"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b\") " pod="openshift-infra/auto-csr-approver-29567796-c5w5j" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.412765 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tcnw\" (UniqueName: \"kubernetes.io/projected/c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b-kube-api-access-8tcnw\") pod \"auto-csr-approver-29567796-c5w5j\" (UID: \"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b\") " pod="openshift-infra/auto-csr-approver-29567796-c5w5j" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.458038 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567796-c5w5j" Mar 21 04:36:00 crc kubenswrapper[4839]: I0321 04:36:00.835546 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567796-c5w5j"] Mar 21 04:36:01 crc kubenswrapper[4839]: I0321 04:36:01.049823 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567796-c5w5j" event={"ID":"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b","Type":"ContainerStarted","Data":"81b430d783f9d19437394ba6f18815c98c399b02dbf007aedb78110cae8ef789"} Mar 21 04:36:03 crc kubenswrapper[4839]: I0321 04:36:03.061538 4839 generic.go:334] "Generic (PLEG): container finished" podID="c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b" containerID="edf0b9b310ad11f4cb21b959eb633d808203a45ec2b8463a2fe875186e107484" exitCode=0 Mar 21 04:36:03 crc kubenswrapper[4839]: I0321 04:36:03.061642 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567796-c5w5j" event={"ID":"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b","Type":"ContainerDied","Data":"edf0b9b310ad11f4cb21b959eb633d808203a45ec2b8463a2fe875186e107484"} Mar 21 04:36:04 crc kubenswrapper[4839]: I0321 04:36:04.324456 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567796-c5w5j" Mar 21 04:36:04 crc kubenswrapper[4839]: I0321 04:36:04.443294 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tcnw\" (UniqueName: \"kubernetes.io/projected/c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b-kube-api-access-8tcnw\") pod \"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b\" (UID: \"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b\") " Mar 21 04:36:04 crc kubenswrapper[4839]: I0321 04:36:04.449076 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b-kube-api-access-8tcnw" (OuterVolumeSpecName: "kube-api-access-8tcnw") pod "c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b" (UID: "c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b"). InnerVolumeSpecName "kube-api-access-8tcnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:36:04 crc kubenswrapper[4839]: I0321 04:36:04.545172 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tcnw\" (UniqueName: \"kubernetes.io/projected/c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b-kube-api-access-8tcnw\") on node \"crc\" DevicePath \"\"" Mar 21 04:36:05 crc kubenswrapper[4839]: I0321 04:36:05.075915 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567796-c5w5j" event={"ID":"c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b","Type":"ContainerDied","Data":"81b430d783f9d19437394ba6f18815c98c399b02dbf007aedb78110cae8ef789"} Mar 21 04:36:05 crc kubenswrapper[4839]: I0321 04:36:05.075972 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81b430d783f9d19437394ba6f18815c98c399b02dbf007aedb78110cae8ef789" Mar 21 04:36:05 crc kubenswrapper[4839]: I0321 04:36:05.076019 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567796-c5w5j" Mar 21 04:36:05 crc kubenswrapper[4839]: I0321 04:36:05.386013 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567790-h7nhz"] Mar 21 04:36:05 crc kubenswrapper[4839]: I0321 04:36:05.390431 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567790-h7nhz"] Mar 21 04:36:06 crc kubenswrapper[4839]: I0321 04:36:06.458289 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64e6efc9-03ce-4af4-bcc2-bc64ceebc652" path="/var/lib/kubelet/pods/64e6efc9-03ce-4af4-bcc2-bc64ceebc652/volumes" Mar 21 04:36:29 crc kubenswrapper[4839]: I0321 04:36:29.971226 4839 scope.go:117] "RemoveContainer" containerID="7a160fd6d3c601d634e7f0ddbce27e4379f3d0fc66482e35f835bfe3e44b6c2b" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.911422 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-v297k"] Mar 21 04:36:49 crc kubenswrapper[4839]: E0321 04:36:49.913398 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b" containerName="oc" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.913414 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b" containerName="oc" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.913532 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b" containerName="oc" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.914716 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.918616 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.918994 4839 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-59dq7" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.924335 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-x2cpt"] Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.925123 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-x2cpt" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.925553 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.929485 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-v297k"] Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.939398 4839 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8lbcs" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.947839 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-x2cpt"] Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.953616 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-s9zj6"] Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.954470 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.956165 4839 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-zw6m8" Mar 21 04:36:49 crc kubenswrapper[4839]: I0321 04:36:49.968829 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-s9zj6"] Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.102727 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrtvn\" (UniqueName: \"kubernetes.io/projected/daed7a16-7023-463e-9d60-3f56f091f73e-kube-api-access-qrtvn\") pod \"cert-manager-858654f9db-x2cpt\" (UID: \"daed7a16-7023-463e-9d60-3f56f091f73e\") " pod="cert-manager/cert-manager-858654f9db-x2cpt" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.102809 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfkbz\" (UniqueName: \"kubernetes.io/projected/d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f-kube-api-access-tfkbz\") pod \"cert-manager-webhook-687f57d79b-s9zj6\" (UID: \"d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.102922 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdlqn\" (UniqueName: \"kubernetes.io/projected/814a91ac-5e2f-4479-88a3-254e4216e50c-kube-api-access-sdlqn\") pod \"cert-manager-cainjector-cf98fcc89-v297k\" (UID: \"814a91ac-5e2f-4479-88a3-254e4216e50c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.204520 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrtvn\" (UniqueName: 
\"kubernetes.io/projected/daed7a16-7023-463e-9d60-3f56f091f73e-kube-api-access-qrtvn\") pod \"cert-manager-858654f9db-x2cpt\" (UID: \"daed7a16-7023-463e-9d60-3f56f091f73e\") " pod="cert-manager/cert-manager-858654f9db-x2cpt" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.204603 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfkbz\" (UniqueName: \"kubernetes.io/projected/d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f-kube-api-access-tfkbz\") pod \"cert-manager-webhook-687f57d79b-s9zj6\" (UID: \"d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.204673 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdlqn\" (UniqueName: \"kubernetes.io/projected/814a91ac-5e2f-4479-88a3-254e4216e50c-kube-api-access-sdlqn\") pod \"cert-manager-cainjector-cf98fcc89-v297k\" (UID: \"814a91ac-5e2f-4479-88a3-254e4216e50c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.231455 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrtvn\" (UniqueName: \"kubernetes.io/projected/daed7a16-7023-463e-9d60-3f56f091f73e-kube-api-access-qrtvn\") pod \"cert-manager-858654f9db-x2cpt\" (UID: \"daed7a16-7023-463e-9d60-3f56f091f73e\") " pod="cert-manager/cert-manager-858654f9db-x2cpt" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.232592 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdlqn\" (UniqueName: \"kubernetes.io/projected/814a91ac-5e2f-4479-88a3-254e4216e50c-kube-api-access-sdlqn\") pod \"cert-manager-cainjector-cf98fcc89-v297k\" (UID: \"814a91ac-5e2f-4479-88a3-254e4216e50c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.233399 4839 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tfkbz\" (UniqueName: \"kubernetes.io/projected/d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f-kube-api-access-tfkbz\") pod \"cert-manager-webhook-687f57d79b-s9zj6\" (UID: \"d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.250478 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.263209 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-x2cpt" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.272917 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.676876 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-x2cpt"] Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.683049 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-s9zj6"] Mar 21 04:36:50 crc kubenswrapper[4839]: W0321 04:36:50.690593 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaed7a16_7023_463e_9d60_3f56f091f73e.slice/crio-71bfec46c62749530b458ffa7a5e2d59a383e1598aef62ecfd57fd55bc7fb110 WatchSource:0}: Error finding container 71bfec46c62749530b458ffa7a5e2d59a383e1598aef62ecfd57fd55bc7fb110: Status 404 returned error can't find the container with id 71bfec46c62749530b458ffa7a5e2d59a383e1598aef62ecfd57fd55bc7fb110 Mar 21 04:36:50 crc kubenswrapper[4839]: I0321 04:36:50.696252 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-v297k"] Mar 21 04:36:50 crc 
kubenswrapper[4839]: W0321 04:36:50.703418 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd70f5b8f_f5a8_4829_b4e1_7a7a12dddd1f.slice/crio-9c2e27f3df9f328e368b776dfb590ef3e58fdb0406efbeba9904be4426d36b5a WatchSource:0}: Error finding container 9c2e27f3df9f328e368b776dfb590ef3e58fdb0406efbeba9904be4426d36b5a: Status 404 returned error can't find the container with id 9c2e27f3df9f328e368b776dfb590ef3e58fdb0406efbeba9904be4426d36b5a Mar 21 04:36:51 crc kubenswrapper[4839]: I0321 04:36:51.355778 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" event={"ID":"d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f","Type":"ContainerStarted","Data":"9c2e27f3df9f328e368b776dfb590ef3e58fdb0406efbeba9904be4426d36b5a"} Mar 21 04:36:51 crc kubenswrapper[4839]: I0321 04:36:51.357152 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-x2cpt" event={"ID":"daed7a16-7023-463e-9d60-3f56f091f73e","Type":"ContainerStarted","Data":"71bfec46c62749530b458ffa7a5e2d59a383e1598aef62ecfd57fd55bc7fb110"} Mar 21 04:36:51 crc kubenswrapper[4839]: I0321 04:36:51.358111 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" event={"ID":"814a91ac-5e2f-4479-88a3-254e4216e50c","Type":"ContainerStarted","Data":"816fae4ac38734a4473d5aa061bfa6f3a4f63f96f190b173b9adbcb2aaffd2a2"} Mar 21 04:36:55 crc kubenswrapper[4839]: I0321 04:36:55.386191 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" event={"ID":"814a91ac-5e2f-4479-88a3-254e4216e50c","Type":"ContainerStarted","Data":"58209ebe8c80a719c78fd621af2586a50b4ce6e9a54e4085cfd695e0fb52d176"} Mar 21 04:36:55 crc kubenswrapper[4839]: I0321 04:36:55.406013 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-v297k" podStartSLOduration=2.527279573 podStartE2EDuration="6.405997753s" podCreationTimestamp="2026-03-21 04:36:49 +0000 UTC" firstStartedPulling="2026-03-21 04:36:50.702671083 +0000 UTC m=+815.030457809" lastFinishedPulling="2026-03-21 04:36:54.581389303 +0000 UTC m=+818.909175989" observedRunningTime="2026-03-21 04:36:55.403860556 +0000 UTC m=+819.731647272" watchObservedRunningTime="2026-03-21 04:36:55.405997753 +0000 UTC m=+819.733784429" Mar 21 04:36:57 crc kubenswrapper[4839]: I0321 04:36:57.401447 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" event={"ID":"d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f","Type":"ContainerStarted","Data":"5e8efcf0912016e52ad90c06eb91b7c511365ffc59b290ed94cf862a643a434c"} Mar 21 04:36:57 crc kubenswrapper[4839]: I0321 04:36:57.402022 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" Mar 21 04:36:57 crc kubenswrapper[4839]: I0321 04:36:57.405378 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-x2cpt" event={"ID":"daed7a16-7023-463e-9d60-3f56f091f73e","Type":"ContainerStarted","Data":"361b894f501f07723005e385884369bf997646eee850b84b3d1ce3020fefe03c"} Mar 21 04:36:57 crc kubenswrapper[4839]: I0321 04:36:57.426711 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" podStartSLOduration=2.762408116 podStartE2EDuration="8.426683404s" podCreationTimestamp="2026-03-21 04:36:49 +0000 UTC" firstStartedPulling="2026-03-21 04:36:50.705631463 +0000 UTC m=+815.033418159" lastFinishedPulling="2026-03-21 04:36:56.369906771 +0000 UTC m=+820.697693447" observedRunningTime="2026-03-21 04:36:57.422784479 +0000 UTC m=+821.750571235" watchObservedRunningTime="2026-03-21 04:36:57.426683404 +0000 UTC m=+821.754470160" Mar 21 04:36:57 crc 
kubenswrapper[4839]: I0321 04:36:57.446303 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-x2cpt" podStartSLOduration=2.651085407 podStartE2EDuration="8.446276908s" podCreationTimestamp="2026-03-21 04:36:49 +0000 UTC" firstStartedPulling="2026-03-21 04:36:50.693910409 +0000 UTC m=+815.021697095" lastFinishedPulling="2026-03-21 04:36:56.48910192 +0000 UTC m=+820.816888596" observedRunningTime="2026-03-21 04:36:57.44447656 +0000 UTC m=+821.772263276" watchObservedRunningTime="2026-03-21 04:36:57.446276908 +0000 UTC m=+821.774063614" Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.951645 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-spl4b"] Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.952851 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-controller" containerID="cri-o://0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4" gracePeriod=30 Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.952870 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="nbdb" containerID="cri-o://340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0" gracePeriod=30 Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.952952 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-node" containerID="cri-o://5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92" gracePeriod=30 Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.953009 4839 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="northd" containerID="cri-o://04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e" gracePeriod=30 Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.953070 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-acl-logging" containerID="cri-o://821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e" gracePeriod=30 Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.953088 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="sbdb" containerID="cri-o://ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536" gracePeriod=30 Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.953121 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172" gracePeriod=30 Mar 21 04:36:59 crc kubenswrapper[4839]: I0321 04:36:59.988849 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" containerID="cri-o://06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" gracePeriod=30 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.246689 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/3.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 
04:37:00.249634 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovn-acl-logging/0.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.250038 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovn-controller/0.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.250399 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309655 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vdfz5"] Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309854 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309866 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309874 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309880 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309889 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="sbdb" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309894 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="sbdb" Mar 21 04:37:00 crc kubenswrapper[4839]: 
E0321 04:37:00.309904 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309909 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309916 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309921 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309932 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-acl-logging" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309937 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-acl-logging" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309944 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-ovn-metrics" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309949 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-ovn-metrics" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309959 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kubecfg-setup" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309965 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kubecfg-setup" Mar 21 04:37:00 crc 
kubenswrapper[4839]: E0321 04:37:00.309975 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="nbdb" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309981 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="nbdb" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.309991 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-node" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.309998 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-node" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.310004 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="northd" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310010 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="northd" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310105 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="northd" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310119 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-ovn-metrics" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310126 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="sbdb" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310133 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 
04:37:00.310140 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="nbdb" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310146 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310153 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310161 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310170 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-acl-logging" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310177 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="kube-rbac-proxy-node" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310184 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovn-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.310284 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310291 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.310298 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 
04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310303 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.310388 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" containerName="ovnkube-controller" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.311871 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354041 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354108 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-config\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354160 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354204 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-ovn\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354272 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354274 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-kubelet\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354297 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354514 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-script-lib\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354543 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-openvswitch\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354560 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354604 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d634043b-c9ec-4469-b267-26053b1f02f9-ovn-node-metrics-cert\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354619 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354628 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-systemd\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354658 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-bin\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354680 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-slash\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354712 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-netns\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354736 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-node-log\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354748 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354763 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-slash" (OuterVolumeSpecName: "host-slash") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354769 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-log-socket\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354787 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354794 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-log-socket" (OuterVolumeSpecName: "log-socket") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354807 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-node-log" (OuterVolumeSpecName: "node-log") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354809 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-systemd-units\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354827 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354833 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-netd\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354856 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-ovn-kubernetes\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354882 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdph2\" (UniqueName: \"kubernetes.io/projected/d634043b-c9ec-4469-b267-26053b1f02f9-kube-api-access-cdph2\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354907 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-etc-openvswitch\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354929 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-var-lib-openvswitch\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354906 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354913 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354933 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354953 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-env-overrides\") pod \"d634043b-c9ec-4469-b267-26053b1f02f9\" (UID: \"d634043b-c9ec-4469-b267-26053b1f02f9\") " Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.354966 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355049 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355082 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-node-log\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355102 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d876p\" (UniqueName: \"kubernetes.io/projected/cb64f802-d294-4fd5-a691-da3096ee0978-kube-api-access-d876p\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355121 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355144 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-var-lib-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355167 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-ovn\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355190 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-kubelet\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355220 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-run-ovn-kubernetes\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355251 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355358 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-systemd\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355396 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-slash\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355433 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-etc-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355451 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-log-socket\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355471 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-env-overrides\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355509 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355545 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-ovnkube-script-lib\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355584 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-cni-bin\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355604 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-ovnkube-config\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355639 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-run-netns\") pod 
\"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355691 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb64f802-d294-4fd5-a691-da3096ee0978-ovn-node-metrics-cert\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355712 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-cni-netd\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355741 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-systemd-units\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355789 4839 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355800 4839 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355809 4839 reconciler_common.go:293] "Volume detached for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355818 4839 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355826 4839 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355834 4839 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355843 4839 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355851 4839 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355860 4839 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355877 4839 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355885 4839 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d634043b-c9ec-4469-b267-26053b1f02f9-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355893 4839 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355902 4839 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355909 4839 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-slash\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355917 4839 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355926 4839 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-node-log\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.355933 4839 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-log-socket\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc 
kubenswrapper[4839]: I0321 04:37:00.359640 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d634043b-c9ec-4469-b267-26053b1f02f9-kube-api-access-cdph2" (OuterVolumeSpecName: "kube-api-access-cdph2") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "kube-api-access-cdph2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.359902 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d634043b-c9ec-4469-b267-26053b1f02f9-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.370412 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d634043b-c9ec-4469-b267-26053b1f02f9" (UID: "d634043b-c9ec-4469-b267-26053b1f02f9"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.421954 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovnkube-controller/3.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.424769 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovn-acl-logging/0.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.425453 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-spl4b_d634043b-c9ec-4469-b267-26053b1f02f9/ovn-controller/0.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426045 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" exitCode=0 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426076 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536" exitCode=0 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426087 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0" exitCode=0 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426097 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e" exitCode=0 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426107 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" 
containerID="c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172" exitCode=0 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426117 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92" exitCode=0 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426125 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e" exitCode=143 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426134 4839 generic.go:334] "Generic (PLEG): container finished" podID="d634043b-c9ec-4469-b267-26053b1f02f9" containerID="0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4" exitCode=143 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426155 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426188 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426200 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426205 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426211 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426344 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426370 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426388 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426223 4839 scope.go:117] "RemoveContainer" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426400 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426486 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 
04:37:00.426498 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426507 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426514 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426521 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426528 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426534 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426548 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426579 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426587 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426595 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426603 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426610 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426617 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426625 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426632 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426639 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426647 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426657 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426668 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426676 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426684 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426692 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426699 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426710 4839 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426720 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426729 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426740 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426750 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426762 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-spl4b" event={"ID":"d634043b-c9ec-4469-b267-26053b1f02f9","Type":"ContainerDied","Data":"f75e324ef6ce35e2f9d2ecb83aad79d37f2471563f3c265cdfc0e67df74a76f1"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426776 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426788 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} Mar 21 
04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426798 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426807 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426815 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426822 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426829 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426836 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426843 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.426850 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} Mar 21 
04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.427657 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/2.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.428108 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/1.log" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.428138 4839 generic.go:334] "Generic (PLEG): container finished" podID="1602189b-f4f3-40ee-ba63-c695c11069d0" containerID="44c7b00e724e15bccb8ef54953306d49bc029cd21069ea40d7f724706be68de4" exitCode=2 Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.428164 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerDied","Data":"44c7b00e724e15bccb8ef54953306d49bc029cd21069ea40d7f724706be68de4"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.428179 4839 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1"} Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.428621 4839 scope.go:117] "RemoveContainer" containerID="44c7b00e724e15bccb8ef54953306d49bc029cd21069ea40d7f724706be68de4" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.458301 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461195 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461282 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-node-log\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461504 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d876p\" (UniqueName: \"kubernetes.io/projected/cb64f802-d294-4fd5-a691-da3096ee0978-kube-api-access-d876p\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461508 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461727 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-node-log\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461749 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" 
Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461817 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-var-lib-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461851 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461953 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-var-lib-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.461870 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-ovn\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462063 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-kubelet\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462127 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-ovn\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462247 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-run-ovn-kubernetes\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462305 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-systemd\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462504 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-kubelet\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462523 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-slash\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-etc-openvswitch\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462695 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-slash\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462721 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-log-socket\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462763 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-run-systemd\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462523 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-run-ovn-kubernetes\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462828 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-etc-openvswitch\") pod \"ovnkube-node-vdfz5\" 
(UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462928 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-env-overrides\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.462977 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-ovnkube-script-lib\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.463068 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-cni-bin\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.463121 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-ovnkube-config\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.463247 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-run-netns\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.463068 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-log-socket\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.463301 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb64f802-d294-4fd5-a691-da3096ee0978-ovn-node-metrics-cert\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.463407 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-cni-netd\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.463461 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-systemd-units\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.464282 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-env-overrides\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 
04:37:00.464310 4839 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d634043b-c9ec-4469-b267-26053b1f02f9-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.464341 4839 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d634043b-c9ec-4469-b267-26053b1f02f9-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.464356 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdph2\" (UniqueName: \"kubernetes.io/projected/d634043b-c9ec-4469-b267-26053b1f02f9-kube-api-access-cdph2\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.464515 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-systemd-units\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.464604 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-cni-netd\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.464678 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-cni-bin\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.464755 4839 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb64f802-d294-4fd5-a691-da3096ee0978-host-run-netns\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.465095 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-ovnkube-config\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.465927 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cb64f802-d294-4fd5-a691-da3096ee0978-ovnkube-script-lib\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.471746 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cb64f802-d294-4fd5-a691-da3096ee0978-ovn-node-metrics-cert\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.487261 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d876p\" (UniqueName: \"kubernetes.io/projected/cb64f802-d294-4fd5-a691-da3096ee0978-kube-api-access-d876p\") pod \"ovnkube-node-vdfz5\" (UID: \"cb64f802-d294-4fd5-a691-da3096ee0978\") " pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.494673 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-spl4b"] Mar 21 
04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.497141 4839 scope.go:117] "RemoveContainer" containerID="ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.499377 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-spl4b"] Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.513662 4839 scope.go:117] "RemoveContainer" containerID="340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.543895 4839 scope.go:117] "RemoveContainer" containerID="04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.556269 4839 scope.go:117] "RemoveContainer" containerID="c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.567823 4839 scope.go:117] "RemoveContainer" containerID="5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.581652 4839 scope.go:117] "RemoveContainer" containerID="821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.594647 4839 scope.go:117] "RemoveContainer" containerID="0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.606434 4839 scope.go:117] "RemoveContainer" containerID="466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.619090 4839 scope.go:117] "RemoveContainer" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.619413 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": container with ID starting with 06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e not found: ID does not exist" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.619445 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} err="failed to get container status \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": rpc error: code = NotFound desc = could not find container \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": container with ID starting with 06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.619466 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.619728 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": container with ID starting with 616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f not found: ID does not exist" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.619749 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} err="failed to get container status \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": rpc error: code = NotFound desc = could not find container \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": container with ID 
starting with 616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.619763 4839 scope.go:117] "RemoveContainer" containerID="ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.620114 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": container with ID starting with ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536 not found: ID does not exist" containerID="ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620137 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} err="failed to get container status \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": rpc error: code = NotFound desc = could not find container \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": container with ID starting with ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620150 4839 scope.go:117] "RemoveContainer" containerID="340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.620471 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": container with ID starting with 340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0 not found: ID does not exist" containerID="340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0" Mar 21 
04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620489 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} err="failed to get container status \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": rpc error: code = NotFound desc = could not find container \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": container with ID starting with 340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620501 4839 scope.go:117] "RemoveContainer" containerID="04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.620726 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": container with ID starting with 04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e not found: ID does not exist" containerID="04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620765 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} err="failed to get container status \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": rpc error: code = NotFound desc = could not find container \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": container with ID starting with 04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620780 4839 scope.go:117] "RemoveContainer" 
containerID="c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.620947 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": container with ID starting with c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172 not found: ID does not exist" containerID="c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620967 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} err="failed to get container status \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": rpc error: code = NotFound desc = could not find container \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": container with ID starting with c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.620980 4839 scope.go:117] "RemoveContainer" containerID="5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.621236 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": container with ID starting with 5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92 not found: ID does not exist" containerID="5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.621298 4839 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} err="failed to get container status \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": rpc error: code = NotFound desc = could not find container \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": container with ID starting with 5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.621338 4839 scope.go:117] "RemoveContainer" containerID="821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.621699 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": container with ID starting with 821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e not found: ID does not exist" containerID="821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.621719 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} err="failed to get container status \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": rpc error: code = NotFound desc = could not find container \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": container with ID starting with 821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.621732 4839 scope.go:117] "RemoveContainer" containerID="0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.621964 4839 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": container with ID starting with 0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4 not found: ID does not exist" containerID="0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.621983 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} err="failed to get container status \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": rpc error: code = NotFound desc = could not find container \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": container with ID starting with 0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.621996 4839 scope.go:117] "RemoveContainer" containerID="466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c" Mar 21 04:37:00 crc kubenswrapper[4839]: E0321 04:37:00.622282 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": container with ID starting with 466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c not found: ID does not exist" containerID="466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.622321 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} err="failed to get container status \"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": rpc error: code = NotFound desc = could not find container 
\"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": container with ID starting with 466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.622346 4839 scope.go:117] "RemoveContainer" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.622649 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} err="failed to get container status \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": rpc error: code = NotFound desc = could not find container \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": container with ID starting with 06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.622668 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.622918 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} err="failed to get container status \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": rpc error: code = NotFound desc = could not find container \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": container with ID starting with 616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.622936 4839 scope.go:117] "RemoveContainer" containerID="ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.623170 4839 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} err="failed to get container status \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": rpc error: code = NotFound desc = could not find container \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": container with ID starting with ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.623193 4839 scope.go:117] "RemoveContainer" containerID="340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.623457 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} err="failed to get container status \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": rpc error: code = NotFound desc = could not find container \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": container with ID starting with 340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.623475 4839 scope.go:117] "RemoveContainer" containerID="04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.623546 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.623788 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} err="failed to get container status \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": rpc error: code = NotFound desc = could not find container \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": container with ID starting with 04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.623809 4839 scope.go:117] "RemoveContainer" containerID="c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.624119 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} err="failed to get container status \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": rpc error: code = NotFound desc = could not find container \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": container with ID starting with c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.624162 4839 scope.go:117] "RemoveContainer" containerID="5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.624530 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} err="failed to get container status \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": rpc error: code = NotFound desc = could not 
find container \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": container with ID starting with 5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.624548 4839 scope.go:117] "RemoveContainer" containerID="821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.624847 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} err="failed to get container status \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": rpc error: code = NotFound desc = could not find container \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": container with ID starting with 821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.624918 4839 scope.go:117] "RemoveContainer" containerID="0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.625175 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} err="failed to get container status \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": rpc error: code = NotFound desc = could not find container \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": container with ID starting with 0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.625196 4839 scope.go:117] "RemoveContainer" containerID="466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.625504 
4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} err="failed to get container status \"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": rpc error: code = NotFound desc = could not find container \"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": container with ID starting with 466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.625531 4839 scope.go:117] "RemoveContainer" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.625857 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} err="failed to get container status \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": rpc error: code = NotFound desc = could not find container \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": container with ID starting with 06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.625878 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.626092 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} err="failed to get container status \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": rpc error: code = NotFound desc = could not find container \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": container with ID starting with 
616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.626128 4839 scope.go:117] "RemoveContainer" containerID="ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.626395 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} err="failed to get container status \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": rpc error: code = NotFound desc = could not find container \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": container with ID starting with ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.626439 4839 scope.go:117] "RemoveContainer" containerID="340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.626760 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} err="failed to get container status \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": rpc error: code = NotFound desc = could not find container \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": container with ID starting with 340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.626789 4839 scope.go:117] "RemoveContainer" containerID="04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.627119 4839 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} err="failed to get container status \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": rpc error: code = NotFound desc = could not find container \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": container with ID starting with 04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.627147 4839 scope.go:117] "RemoveContainer" containerID="c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.627422 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} err="failed to get container status \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": rpc error: code = NotFound desc = could not find container \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": container with ID starting with c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.627443 4839 scope.go:117] "RemoveContainer" containerID="5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.627793 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} err="failed to get container status \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": rpc error: code = NotFound desc = could not find container \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": container with ID starting with 5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92 not found: ID does not 
exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.627815 4839 scope.go:117] "RemoveContainer" containerID="821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.628590 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} err="failed to get container status \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": rpc error: code = NotFound desc = could not find container \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": container with ID starting with 821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.628617 4839 scope.go:117] "RemoveContainer" containerID="0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.628897 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} err="failed to get container status \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": rpc error: code = NotFound desc = could not find container \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": container with ID starting with 0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.628917 4839 scope.go:117] "RemoveContainer" containerID="466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.629157 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} err="failed to get container status 
\"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": rpc error: code = NotFound desc = could not find container \"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": container with ID starting with 466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.629176 4839 scope.go:117] "RemoveContainer" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.629446 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} err="failed to get container status \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": rpc error: code = NotFound desc = could not find container \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": container with ID starting with 06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.629465 4839 scope.go:117] "RemoveContainer" containerID="616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.629739 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f"} err="failed to get container status \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": rpc error: code = NotFound desc = could not find container \"616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f\": container with ID starting with 616e01984d5c14dee369d9f7e59f96c9dfaccfd431fc19972454ffbbb0b5489f not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.629757 4839 scope.go:117] "RemoveContainer" 
containerID="ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.630018 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536"} err="failed to get container status \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": rpc error: code = NotFound desc = could not find container \"ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536\": container with ID starting with ee9c34719e2681860a7065870d4f0fd1d5aae10da5f2cc780e83f8020211e536 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.630038 4839 scope.go:117] "RemoveContainer" containerID="340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.630342 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0"} err="failed to get container status \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": rpc error: code = NotFound desc = could not find container \"340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0\": container with ID starting with 340ce870dbfbd3b102f8f1c5e6705da80e4a23db035175d7746ffe6ad63310f0 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.630361 4839 scope.go:117] "RemoveContainer" containerID="04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.630589 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e"} err="failed to get container status \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": rpc error: code = NotFound desc = could 
not find container \"04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e\": container with ID starting with 04c0470b8aa154ad901d5416631136ad2480da45bff39836dda6837a3e4b980e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.630608 4839 scope.go:117] "RemoveContainer" containerID="c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.630995 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172"} err="failed to get container status \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": rpc error: code = NotFound desc = could not find container \"c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172\": container with ID starting with c532a8668c47301ab93dbb1ea21be6ed687a83a6c9008b887e5023a46f27d172 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.631011 4839 scope.go:117] "RemoveContainer" containerID="5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.631275 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92"} err="failed to get container status \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": rpc error: code = NotFound desc = could not find container \"5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92\": container with ID starting with 5854b81091eb3920f499e2fb8bdb5e51afa5cf975397ad7d789603150fdafe92 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.631294 4839 scope.go:117] "RemoveContainer" containerID="821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 
04:37:00.631603 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e"} err="failed to get container status \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": rpc error: code = NotFound desc = could not find container \"821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e\": container with ID starting with 821316a28b33f897df4ea4ae553ed0cbdd066ff220cc310af604bbe062e4b00e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.631622 4839 scope.go:117] "RemoveContainer" containerID="0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.631974 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4"} err="failed to get container status \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": rpc error: code = NotFound desc = could not find container \"0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4\": container with ID starting with 0555faac11cd1da1d5573074ee5d8704c1c4a7d8559996229bf242d9ed7290f4 not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.632002 4839 scope.go:117] "RemoveContainer" containerID="466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.632301 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c"} err="failed to get container status \"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": rpc error: code = NotFound desc = could not find container \"466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c\": container with ID starting with 
466ed122245008a86b4471250670e086e61c981de3a7a026461c7c2238c87f3c not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.632341 4839 scope.go:117] "RemoveContainer" containerID="06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e" Mar 21 04:37:00 crc kubenswrapper[4839]: I0321 04:37:00.632759 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e"} err="failed to get container status \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": rpc error: code = NotFound desc = could not find container \"06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e\": container with ID starting with 06ba1fe337b0b90d457f17a01536cb29904e3afa201457970ad79c949040543e not found: ID does not exist" Mar 21 04:37:00 crc kubenswrapper[4839]: W0321 04:37:00.641536 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb64f802_d294_4fd5_a691_da3096ee0978.slice/crio-a477abd48e8a183005c5fb2c96cbc3b3d90374b5e29c8c80ec6eb64b98380b2c WatchSource:0}: Error finding container a477abd48e8a183005c5fb2c96cbc3b3d90374b5e29c8c80ec6eb64b98380b2c: Status 404 returned error can't find the container with id a477abd48e8a183005c5fb2c96cbc3b3d90374b5e29c8c80ec6eb64b98380b2c Mar 21 04:37:01 crc kubenswrapper[4839]: I0321 04:37:01.438824 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/2.log" Mar 21 04:37:01 crc kubenswrapper[4839]: I0321 04:37:01.440285 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/1.log" Mar 21 04:37:01 crc kubenswrapper[4839]: I0321 04:37:01.440489 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zqcw4" 
event={"ID":"1602189b-f4f3-40ee-ba63-c695c11069d0","Type":"ContainerStarted","Data":"71df07ffacae9482203a8cabda642d7215f74bd8f6c84a260cf8febd9e078ba4"} Mar 21 04:37:01 crc kubenswrapper[4839]: I0321 04:37:01.443240 4839 generic.go:334] "Generic (PLEG): container finished" podID="cb64f802-d294-4fd5-a691-da3096ee0978" containerID="6b659d570cf01643372fe7dfe0d11fb064f2d21ec2dfdb605b3b0751a44a44db" exitCode=0 Mar 21 04:37:01 crc kubenswrapper[4839]: I0321 04:37:01.443299 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerDied","Data":"6b659d570cf01643372fe7dfe0d11fb064f2d21ec2dfdb605b3b0751a44a44db"} Mar 21 04:37:01 crc kubenswrapper[4839]: I0321 04:37:01.443344 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"a477abd48e8a183005c5fb2c96cbc3b3d90374b5e29c8c80ec6eb64b98380b2c"} Mar 21 04:37:02 crc kubenswrapper[4839]: I0321 04:37:02.451563 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"f879662c8aebd4b6919b577691c554116168a132db8b6705855aa831492be275"} Mar 21 04:37:02 crc kubenswrapper[4839]: I0321 04:37:02.460277 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d634043b-c9ec-4469-b267-26053b1f02f9" path="/var/lib/kubelet/pods/d634043b-c9ec-4469-b267-26053b1f02f9/volumes" Mar 21 04:37:02 crc kubenswrapper[4839]: I0321 04:37:02.461271 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"fed131455fa0580c3e4f7918efeb38a28bb7ec55c2dddce52627784de7a65f59"} Mar 21 04:37:02 crc kubenswrapper[4839]: I0321 04:37:02.461299 
4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"a6483341a255f761d85ad7aeb013462317460d06af975951231edc45f9e20aa1"} Mar 21 04:37:02 crc kubenswrapper[4839]: I0321 04:37:02.461323 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"ceb79d4636a8f9f85d22b1b5027dc86cd38b45a3cf957b7e8581665713d35d02"} Mar 21 04:37:02 crc kubenswrapper[4839]: I0321 04:37:02.461333 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"2ea140b265ee9f3cf33bac33aa856c4694af4dafd7b61dc901561c695b98234f"} Mar 21 04:37:02 crc kubenswrapper[4839]: I0321 04:37:02.461341 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"2f70a4e247f7d8c7918a238fe573958c42eada15b75225f677902d3d12197324"} Mar 21 04:37:05 crc kubenswrapper[4839]: I0321 04:37:05.276690 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-s9zj6" Mar 21 04:37:05 crc kubenswrapper[4839]: I0321 04:37:05.484033 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"1521c8728685fe015ce907663e6c1745e77d64e79ad395f301a027b893061cdb"} Mar 21 04:37:07 crc kubenswrapper[4839]: I0321 04:37:07.504173 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" 
event={"ID":"cb64f802-d294-4fd5-a691-da3096ee0978","Type":"ContainerStarted","Data":"e81ce74fdc8cf188c028ae5fd9384ef1daaec91296ec73394ea137308db19fe8"} Mar 21 04:37:07 crc kubenswrapper[4839]: I0321 04:37:07.504700 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:07 crc kubenswrapper[4839]: I0321 04:37:07.541612 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:07 crc kubenswrapper[4839]: I0321 04:37:07.555078 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" podStartSLOduration=7.555064002 podStartE2EDuration="7.555064002s" podCreationTimestamp="2026-03-21 04:37:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:37:07.550785307 +0000 UTC m=+831.878572003" watchObservedRunningTime="2026-03-21 04:37:07.555064002 +0000 UTC m=+831.882850668" Mar 21 04:37:08 crc kubenswrapper[4839]: I0321 04:37:08.508835 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:08 crc kubenswrapper[4839]: I0321 04:37:08.509091 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:08 crc kubenswrapper[4839]: I0321 04:37:08.536099 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:14 crc kubenswrapper[4839]: I0321 04:37:14.302880 4839 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 21 04:37:30 crc kubenswrapper[4839]: I0321 04:37:30.030783 4839 scope.go:117] "RemoveContainer" 
containerID="bc026a6498e53a6dc434660326fcbdb2323e59c0c7d6f9e5d768c3a347b614d1" Mar 21 04:37:30 crc kubenswrapper[4839]: I0321 04:37:30.634958 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zqcw4_1602189b-f4f3-40ee-ba63-c695c11069d0/kube-multus/2.log" Mar 21 04:37:30 crc kubenswrapper[4839]: I0321 04:37:30.645101 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vdfz5" Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.353807 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"] Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.355471 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.359916 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.365131 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"] Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.485397 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.485449 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffk54\" (UniqueName: 
\"kubernetes.io/projected/e0eea72c-ae42-4ea4-a067-6ff3e853c081-kube-api-access-ffk54\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.485551 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.587246 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.587321 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffk54\" (UniqueName: \"kubernetes.io/projected/e0eea72c-ae42-4ea4-a067-6ff3e853c081-kube-api-access-ffk54\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.587437 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-util\") pod 
\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.587876 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.588845 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.605771 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffk54\" (UniqueName: \"kubernetes.io/projected/e0eea72c-ae42-4ea4-a067-6ff3e853c081-kube-api-access-ffk54\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.672702 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" Mar 21 04:37:46 crc kubenswrapper[4839]: I0321 04:37:46.847758 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km"] Mar 21 04:37:47 crc kubenswrapper[4839]: I0321 04:37:47.727365 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerID="97f916ce8a710bdca34d1c2baa215bdf07d025722c2d6356867fad6af911dbeb" exitCode=0 Mar 21 04:37:47 crc kubenswrapper[4839]: I0321 04:37:47.727416 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" event={"ID":"e0eea72c-ae42-4ea4-a067-6ff3e853c081","Type":"ContainerDied","Data":"97f916ce8a710bdca34d1c2baa215bdf07d025722c2d6356867fad6af911dbeb"} Mar 21 04:37:47 crc kubenswrapper[4839]: I0321 04:37:47.727446 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" event={"ID":"e0eea72c-ae42-4ea4-a067-6ff3e853c081","Type":"ContainerStarted","Data":"a2995f0b972f1412efb332edf63f598b757d416c916a33749410e3cf43dc6a7a"} Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.639798 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2z7jr"] Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.641725 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.648757 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2z7jr"] Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.713984 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-catalog-content\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.714094 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-utilities\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.714168 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqkfx\" (UniqueName: \"kubernetes.io/projected/36c4ce7f-79eb-4f18-8573-f6900d7812fe-kube-api-access-gqkfx\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.815181 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-utilities\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.815268 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gqkfx\" (UniqueName: \"kubernetes.io/projected/36c4ce7f-79eb-4f18-8573-f6900d7812fe-kube-api-access-gqkfx\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.815320 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-catalog-content\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.815598 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-utilities\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.815820 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-catalog-content\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.837702 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqkfx\" (UniqueName: \"kubernetes.io/projected/36c4ce7f-79eb-4f18-8573-f6900d7812fe-kube-api-access-gqkfx\") pod \"redhat-operators-2z7jr\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:48 crc kubenswrapper[4839]: I0321 04:37:48.963044 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:49 crc kubenswrapper[4839]: I0321 04:37:49.189422 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2z7jr"] Mar 21 04:37:49 crc kubenswrapper[4839]: I0321 04:37:49.739624 4839 generic.go:334] "Generic (PLEG): container finished" podID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerID="47364d415da3cf65d8eae0846016fd96e5b43e3a480f9d051b30b253d8dce58d" exitCode=0 Mar 21 04:37:49 crc kubenswrapper[4839]: I0321 04:37:49.739706 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z7jr" event={"ID":"36c4ce7f-79eb-4f18-8573-f6900d7812fe","Type":"ContainerDied","Data":"47364d415da3cf65d8eae0846016fd96e5b43e3a480f9d051b30b253d8dce58d"} Mar 21 04:37:49 crc kubenswrapper[4839]: I0321 04:37:49.739736 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z7jr" event={"ID":"36c4ce7f-79eb-4f18-8573-f6900d7812fe","Type":"ContainerStarted","Data":"6d41a78d0e49e188c99866f252414054ca009b7f7c05cfeefe8bbaeba6908685"} Mar 21 04:37:49 crc kubenswrapper[4839]: I0321 04:37:49.741977 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerID="7f2f1955493b275d48d8bf9215382a18b7c71e8975e69015be47c6034da4c8e7" exitCode=0 Mar 21 04:37:49 crc kubenswrapper[4839]: I0321 04:37:49.742034 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" event={"ID":"e0eea72c-ae42-4ea4-a067-6ff3e853c081","Type":"ContainerDied","Data":"7f2f1955493b275d48d8bf9215382a18b7c71e8975e69015be47c6034da4c8e7"} Mar 21 04:37:50 crc kubenswrapper[4839]: I0321 04:37:50.751173 4839 generic.go:334] "Generic (PLEG): container finished" podID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" 
containerID="a71f8e9a655fe731cfee84e65c0d6b9444ddc523531becc69fa5e438aece2bc6" exitCode=0 Mar 21 04:37:50 crc kubenswrapper[4839]: I0321 04:37:50.751241 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" event={"ID":"e0eea72c-ae42-4ea4-a067-6ff3e853c081","Type":"ContainerDied","Data":"a71f8e9a655fe731cfee84e65c0d6b9444ddc523531becc69fa5e438aece2bc6"} Mar 21 04:37:50 crc kubenswrapper[4839]: I0321 04:37:50.754321 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z7jr" event={"ID":"36c4ce7f-79eb-4f18-8573-f6900d7812fe","Type":"ContainerStarted","Data":"e9f5e0197428b66b353464b7a4b43db5842405a6daab284a7a273837bce258b9"} Mar 21 04:37:51 crc kubenswrapper[4839]: I0321 04:37:51.762321 4839 generic.go:334] "Generic (PLEG): container finished" podID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerID="e9f5e0197428b66b353464b7a4b43db5842405a6daab284a7a273837bce258b9" exitCode=0 Mar 21 04:37:51 crc kubenswrapper[4839]: I0321 04:37:51.763771 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z7jr" event={"ID":"36c4ce7f-79eb-4f18-8573-f6900d7812fe","Type":"ContainerDied","Data":"e9f5e0197428b66b353464b7a4b43db5842405a6daab284a7a273837bce258b9"} Mar 21 04:37:51 crc kubenswrapper[4839]: I0321 04:37:51.976018 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.059068 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-bundle\") pod \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.059148 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffk54\" (UniqueName: \"kubernetes.io/projected/e0eea72c-ae42-4ea4-a067-6ff3e853c081-kube-api-access-ffk54\") pod \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.059217 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-util\") pod \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\" (UID: \"e0eea72c-ae42-4ea4-a067-6ff3e853c081\") " Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.059777 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-bundle" (OuterVolumeSpecName: "bundle") pod "e0eea72c-ae42-4ea4-a067-6ff3e853c081" (UID: "e0eea72c-ae42-4ea4-a067-6ff3e853c081"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.066828 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0eea72c-ae42-4ea4-a067-6ff3e853c081-kube-api-access-ffk54" (OuterVolumeSpecName: "kube-api-access-ffk54") pod "e0eea72c-ae42-4ea4-a067-6ff3e853c081" (UID: "e0eea72c-ae42-4ea4-a067-6ff3e853c081"). InnerVolumeSpecName "kube-api-access-ffk54". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.073091 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-util" (OuterVolumeSpecName: "util") pod "e0eea72c-ae42-4ea4-a067-6ff3e853c081" (UID: "e0eea72c-ae42-4ea4-a067-6ff3e853c081"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.160957 4839 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.161003 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffk54\" (UniqueName: \"kubernetes.io/projected/e0eea72c-ae42-4ea4-a067-6ff3e853c081-kube-api-access-ffk54\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.161013 4839 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0eea72c-ae42-4ea4-a067-6ff3e853c081-util\") on node \"crc\" DevicePath \"\"" Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.768791 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z7jr" event={"ID":"36c4ce7f-79eb-4f18-8573-f6900d7812fe","Type":"ContainerStarted","Data":"ebb35bbf64054558101b0109e0037494ebb7b1b197dba55792f4b36d612fa907"} Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.771147 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" event={"ID":"e0eea72c-ae42-4ea4-a067-6ff3e853c081","Type":"ContainerDied","Data":"a2995f0b972f1412efb332edf63f598b757d416c916a33749410e3cf43dc6a7a"} Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.771189 4839 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km" Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.771297 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2995f0b972f1412efb332edf63f598b757d416c916a33749410e3cf43dc6a7a" Mar 21 04:37:52 crc kubenswrapper[4839]: I0321 04:37:52.790054 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2z7jr" podStartSLOduration=2.273817125 podStartE2EDuration="4.790032723s" podCreationTimestamp="2026-03-21 04:37:48 +0000 UTC" firstStartedPulling="2026-03-21 04:37:49.741305348 +0000 UTC m=+874.069092024" lastFinishedPulling="2026-03-21 04:37:52.257520946 +0000 UTC m=+876.585307622" observedRunningTime="2026-03-21 04:37:52.787005079 +0000 UTC m=+877.114791765" watchObservedRunningTime="2026-03-21 04:37:52.790032723 +0000 UTC m=+877.117819399" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.322768 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4"] Mar 21 04:37:54 crc kubenswrapper[4839]: E0321 04:37:54.322983 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerName="util" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.322994 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerName="util" Mar 21 04:37:54 crc kubenswrapper[4839]: E0321 04:37:54.323007 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerName="extract" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.323013 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerName="extract" Mar 21 04:37:54 crc kubenswrapper[4839]: E0321 
04:37:54.323021 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerName="pull" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.323027 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerName="pull" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.323117 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0eea72c-ae42-4ea4-a067-6ff3e853c081" containerName="extract" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.323521 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.327194 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-vb5wp" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.327274 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.327336 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.333615 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4"] Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.386009 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tds6h\" (UniqueName: \"kubernetes.io/projected/fbd83ba5-ac43-45f6-8a15-78ba82a246f7-kube-api-access-tds6h\") pod \"nmstate-operator-796d4cfff4-vrlf4\" (UID: \"fbd83ba5-ac43-45f6-8a15-78ba82a246f7\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.486999 4839 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tds6h\" (UniqueName: \"kubernetes.io/projected/fbd83ba5-ac43-45f6-8a15-78ba82a246f7-kube-api-access-tds6h\") pod \"nmstate-operator-796d4cfff4-vrlf4\" (UID: \"fbd83ba5-ac43-45f6-8a15-78ba82a246f7\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.505228 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tds6h\" (UniqueName: \"kubernetes.io/projected/fbd83ba5-ac43-45f6-8a15-78ba82a246f7-kube-api-access-tds6h\") pod \"nmstate-operator-796d4cfff4-vrlf4\" (UID: \"fbd83ba5-ac43-45f6-8a15-78ba82a246f7\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.637223 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4" Mar 21 04:37:54 crc kubenswrapper[4839]: I0321 04:37:54.871812 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4"] Mar 21 04:37:55 crc kubenswrapper[4839]: I0321 04:37:55.795828 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4" event={"ID":"fbd83ba5-ac43-45f6-8a15-78ba82a246f7","Type":"ContainerStarted","Data":"8770be52a19063d93da5e410bb53a0f306abb53f735dd1eafb8d2751664e9ddd"} Mar 21 04:37:57 crc kubenswrapper[4839]: I0321 04:37:57.809274 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4" event={"ID":"fbd83ba5-ac43-45f6-8a15-78ba82a246f7","Type":"ContainerStarted","Data":"44e281b5b66f07c1c9612fc027c822b02e4291277aac26388ba7d2a8dad97c05"} Mar 21 04:37:57 crc kubenswrapper[4839]: I0321 04:37:57.828525 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-vrlf4" podStartSLOduration=1.8959327639999999 
podStartE2EDuration="3.828494178s" podCreationTimestamp="2026-03-21 04:37:54 +0000 UTC" firstStartedPulling="2026-03-21 04:37:54.889854292 +0000 UTC m=+879.217640968" lastFinishedPulling="2026-03-21 04:37:56.822415706 +0000 UTC m=+881.150202382" observedRunningTime="2026-03-21 04:37:57.825537645 +0000 UTC m=+882.153324361" watchObservedRunningTime="2026-03-21 04:37:57.828494178 +0000 UTC m=+882.156280864" Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.954292 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc"] Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.955100 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.957172 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-xt4s6" Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.963105 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.963310 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.966282 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc"] Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.991651 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4"] Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.992292 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:37:58 crc kubenswrapper[4839]: I0321 04:37:58.994025 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.013517 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4"] Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.017625 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-k57vv"] Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.018448 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.031329 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.043585 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.043631 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9sv7\" (UniqueName: \"kubernetes.io/projected/fdc1639d-742f-41a6-8cb7-318997a4a8b1-kube-api-access-f9sv7\") pod \"nmstate-metrics-9b8c8685d-z5wkc\" (UID: \"fdc1639d-742f-41a6-8cb7-318997a4a8b1\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.043763 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lz8w6\" (UniqueName: \"kubernetes.io/projected/5a2485ca-cb21-4edf-b074-f7ac255f45f8-kube-api-access-lz8w6\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.098368 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g"] Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.099059 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.104518 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.104551 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-nr6km" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.104721 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.112359 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g"] Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145183 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw2g5\" (UniqueName: \"kubernetes.io/projected/42329e42-8b9b-45ed-ab04-bf12468d8859-kube-api-access-cw2g5\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145224 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-ovs-socket\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145246 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145260 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-nmstate-lock\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145283 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-dbus-socket\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145366 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145391 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9sv7\" 
(UniqueName: \"kubernetes.io/projected/fdc1639d-742f-41a6-8cb7-318997a4a8b1-kube-api-access-f9sv7\") pod \"nmstate-metrics-9b8c8685d-z5wkc\" (UID: \"fdc1639d-742f-41a6-8cb7-318997a4a8b1\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145422 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkdsg\" (UniqueName: \"kubernetes.io/projected/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-kube-api-access-gkdsg\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145448 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.145474 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz8w6\" (UniqueName: \"kubernetes.io/projected/5a2485ca-cb21-4edf-b074-f7ac255f45f8-kube-api-access-lz8w6\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:37:59 crc kubenswrapper[4839]: E0321 04:37:59.145787 4839 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 21 04:37:59 crc kubenswrapper[4839]: E0321 04:37:59.145827 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair podName:5a2485ca-cb21-4edf-b074-f7ac255f45f8 
nodeName:}" failed. No retries permitted until 2026-03-21 04:37:59.645813338 +0000 UTC m=+883.973600014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair") pod "nmstate-webhook-5f558f5558-7ghd4" (UID: "5a2485ca-cb21-4edf-b074-f7ac255f45f8") : secret "openshift-nmstate-webhook" not found Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.165500 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz8w6\" (UniqueName: \"kubernetes.io/projected/5a2485ca-cb21-4edf-b074-f7ac255f45f8-kube-api-access-lz8w6\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.165507 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9sv7\" (UniqueName: \"kubernetes.io/projected/fdc1639d-742f-41a6-8cb7-318997a4a8b1-kube-api-access-f9sv7\") pod \"nmstate-metrics-9b8c8685d-z5wkc\" (UID: \"fdc1639d-742f-41a6-8cb7-318997a4a8b1\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246492 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-nmstate-lock\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246558 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-dbus-socket\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 
04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246654 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkdsg\" (UniqueName: \"kubernetes.io/projected/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-kube-api-access-gkdsg\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246664 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-nmstate-lock\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246760 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246800 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw2g5\" (UniqueName: \"kubernetes.io/projected/42329e42-8b9b-45ed-ab04-bf12468d8859-kube-api-access-cw2g5\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246823 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-ovs-socket\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 
04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.246843 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.247092 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-ovs-socket\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.247382 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/42329e42-8b9b-45ed-ab04-bf12468d8859-dbus-socket\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.248044 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.255290 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: 
I0321 04:37:59.269157 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw2g5\" (UniqueName: \"kubernetes.io/projected/42329e42-8b9b-45ed-ab04-bf12468d8859-kube-api-access-cw2g5\") pod \"nmstate-handler-k57vv\" (UID: \"42329e42-8b9b-45ed-ab04-bf12468d8859\") " pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.276676 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkdsg\" (UniqueName: \"kubernetes.io/projected/8e7a66bb-3731-4f75-9a7f-5b9d07a36b39-kube-api-access-gkdsg\") pod \"nmstate-console-plugin-86f58fcf4-j5z4g\" (UID: \"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.279221 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.343366 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.363784 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-95579fd9f-99csd"] Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.364476 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.387270 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-95579fd9f-99csd"] Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.420066 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.448361 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-oauth-serving-cert\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.448688 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-config\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.448743 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-serving-cert\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.448782 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-oauth-config\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.448800 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-service-ca\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.448819 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzkqq\" (UniqueName: \"kubernetes.io/projected/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-kube-api-access-vzkqq\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.448985 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-trusted-ca-bundle\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.528988 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc"] Mar 21 04:37:59 crc kubenswrapper[4839]: W0321 04:37:59.547979 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdc1639d_742f_41a6_8cb7_318997a4a8b1.slice/crio-22bb1435120da004a6680d61d3cda7b5011e0803c4c5a4d0f3a624c0c4568a5a WatchSource:0}: Error finding container 22bb1435120da004a6680d61d3cda7b5011e0803c4c5a4d0f3a624c0c4568a5a: Status 404 returned error can't find the container with id 22bb1435120da004a6680d61d3cda7b5011e0803c4c5a4d0f3a624c0c4568a5a Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.549822 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-serving-cert\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.549872 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-oauth-config\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.549891 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-service-ca\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.549907 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzkqq\" (UniqueName: \"kubernetes.io/projected/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-kube-api-access-vzkqq\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.549950 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-trusted-ca-bundle\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.550000 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-oauth-serving-cert\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.550026 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-config\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.551987 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-service-ca\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.552589 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-trusted-ca-bundle\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.552693 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-config\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.553456 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-oauth-serving-cert\") pod 
\"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.555753 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-serving-cert\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.556261 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-console-oauth-config\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.570262 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzkqq\" (UniqueName: \"kubernetes.io/projected/e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4-kube-api-access-vzkqq\") pod \"console-95579fd9f-99csd\" (UID: \"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4\") " pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.625387 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g"] Mar 21 04:37:59 crc kubenswrapper[4839]: W0321 04:37:59.633149 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7a66bb_3731_4f75_9a7f_5b9d07a36b39.slice/crio-3b3202817c160023a78760462711be586f38e2b1a4df731d3d58a53d7e9207e0 WatchSource:0}: Error finding container 3b3202817c160023a78760462711be586f38e2b1a4df731d3d58a53d7e9207e0: Status 404 returned error can't find the container with id 
3b3202817c160023a78760462711be586f38e2b1a4df731d3d58a53d7e9207e0 Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.651093 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:37:59 crc kubenswrapper[4839]: E0321 04:37:59.651354 4839 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 21 04:37:59 crc kubenswrapper[4839]: E0321 04:37:59.651443 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair podName:5a2485ca-cb21-4edf-b074-f7ac255f45f8 nodeName:}" failed. No retries permitted until 2026-03-21 04:38:00.651422965 +0000 UTC m=+884.979209641 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair") pod "nmstate-webhook-5f558f5558-7ghd4" (UID: "5a2485ca-cb21-4edf-b074-f7ac255f45f8") : secret "openshift-nmstate-webhook" not found Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.699803 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.819608 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-k57vv" event={"ID":"42329e42-8b9b-45ed-ab04-bf12468d8859","Type":"ContainerStarted","Data":"b544eb484d97313f14bd105e4cc46e22488a4e8e42e07ee31d8399d900ba3c0c"} Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.821164 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" event={"ID":"fdc1639d-742f-41a6-8cb7-318997a4a8b1","Type":"ContainerStarted","Data":"22bb1435120da004a6680d61d3cda7b5011e0803c4c5a4d0f3a624c0c4568a5a"} Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.828964 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" event={"ID":"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39","Type":"ContainerStarted","Data":"3b3202817c160023a78760462711be586f38e2b1a4df731d3d58a53d7e9207e0"} Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.869646 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:37:59 crc kubenswrapper[4839]: I0321 04:37:59.897119 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-95579fd9f-99csd"] Mar 21 04:37:59 crc kubenswrapper[4839]: W0321 04:37:59.898760 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ed5e0f_3202_4b63_acb9_e689b9d1b1e4.slice/crio-32df6dfc7fea9f1b2d6f4f1d406e9bb59a75e860d7ca7361f6c842948e9b6125 WatchSource:0}: Error finding container 32df6dfc7fea9f1b2d6f4f1d406e9bb59a75e860d7ca7361f6c842948e9b6125: Status 404 returned error can't find the container with id 32df6dfc7fea9f1b2d6f4f1d406e9bb59a75e860d7ca7361f6c842948e9b6125 Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 
04:38:00.138809 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567798-k5zv2"] Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.140746 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567798-k5zv2" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.142605 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.142696 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.142847 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.150963 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567798-k5zv2"] Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.258889 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwkcd\" (UniqueName: \"kubernetes.io/projected/ad32cfd7-7b60-4c76-8df2-eb2e65b102c3-kube-api-access-pwkcd\") pod \"auto-csr-approver-29567798-k5zv2\" (UID: \"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3\") " pod="openshift-infra/auto-csr-approver-29567798-k5zv2" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.360385 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwkcd\" (UniqueName: \"kubernetes.io/projected/ad32cfd7-7b60-4c76-8df2-eb2e65b102c3-kube-api-access-pwkcd\") pod \"auto-csr-approver-29567798-k5zv2\" (UID: \"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3\") " pod="openshift-infra/auto-csr-approver-29567798-k5zv2" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.384179 4839 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-pwkcd\" (UniqueName: \"kubernetes.io/projected/ad32cfd7-7b60-4c76-8df2-eb2e65b102c3-kube-api-access-pwkcd\") pod \"auto-csr-approver-29567798-k5zv2\" (UID: \"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3\") " pod="openshift-infra/auto-csr-approver-29567798-k5zv2" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.457996 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567798-k5zv2" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.634052 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2z7jr"] Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.656544 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567798-k5zv2"] Mar 21 04:38:00 crc kubenswrapper[4839]: W0321 04:38:00.660898 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad32cfd7_7b60_4c76_8df2_eb2e65b102c3.slice/crio-bd03cbce2553e12ebca4f813fc3769db9f3f1290d71820b0b85085cc50cc8961 WatchSource:0}: Error finding container bd03cbce2553e12ebca4f813fc3769db9f3f1290d71820b0b85085cc50cc8961: Status 404 returned error can't find the container with id bd03cbce2553e12ebca4f813fc3769db9f3f1290d71820b0b85085cc50cc8961 Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.663739 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.673888 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/5a2485ca-cb21-4edf-b074-f7ac255f45f8-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-7ghd4\" (UID: \"5a2485ca-cb21-4edf-b074-f7ac255f45f8\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.814366 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.834338 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567798-k5zv2" event={"ID":"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3","Type":"ContainerStarted","Data":"bd03cbce2553e12ebca4f813fc3769db9f3f1290d71820b0b85085cc50cc8961"} Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.835982 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95579fd9f-99csd" event={"ID":"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4","Type":"ContainerStarted","Data":"2bf04a9dca89aceb104eeeb4ef15898b4b2e73051968502edf52e79ebfde3a8b"} Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.836040 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-95579fd9f-99csd" event={"ID":"e0ed5e0f-3202-4b63-acb9-e689b9d1b1e4","Type":"ContainerStarted","Data":"32df6dfc7fea9f1b2d6f4f1d406e9bb59a75e860d7ca7361f6c842948e9b6125"} Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.853157 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-95579fd9f-99csd" podStartSLOduration=1.853142362 podStartE2EDuration="1.853142362s" podCreationTimestamp="2026-03-21 04:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:38:00.850280282 +0000 UTC m=+885.178066958" watchObservedRunningTime="2026-03-21 04:38:00.853142362 +0000 UTC m=+885.180929038" Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 
04:38:00.980557 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:38:00 crc kubenswrapper[4839]: I0321 04:38:00.980642 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:38:01 crc kubenswrapper[4839]: I0321 04:38:00.999965 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4"] Mar 21 04:38:01 crc kubenswrapper[4839]: W0321 04:38:01.007433 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a2485ca_cb21_4edf_b074_f7ac255f45f8.slice/crio-769a7aa3ee42ef068519b51e71f3992a30992d2666f7ea25c621e539bb327f94 WatchSource:0}: Error finding container 769a7aa3ee42ef068519b51e71f3992a30992d2666f7ea25c621e539bb327f94: Status 404 returned error can't find the container with id 769a7aa3ee42ef068519b51e71f3992a30992d2666f7ea25c621e539bb327f94 Mar 21 04:38:01 crc kubenswrapper[4839]: I0321 04:38:01.841466 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" event={"ID":"5a2485ca-cb21-4edf-b074-f7ac255f45f8","Type":"ContainerStarted","Data":"769a7aa3ee42ef068519b51e71f3992a30992d2666f7ea25c621e539bb327f94"} Mar 21 04:38:01 crc kubenswrapper[4839]: I0321 04:38:01.841881 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2z7jr" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" 
containerName="registry-server" containerID="cri-o://ebb35bbf64054558101b0109e0037494ebb7b1b197dba55792f4b36d612fa907" gracePeriod=2 Mar 21 04:38:03 crc kubenswrapper[4839]: I0321 04:38:03.855923 4839 generic.go:334] "Generic (PLEG): container finished" podID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerID="ebb35bbf64054558101b0109e0037494ebb7b1b197dba55792f4b36d612fa907" exitCode=0 Mar 21 04:38:03 crc kubenswrapper[4839]: I0321 04:38:03.856014 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z7jr" event={"ID":"36c4ce7f-79eb-4f18-8573-f6900d7812fe","Type":"ContainerDied","Data":"ebb35bbf64054558101b0109e0037494ebb7b1b197dba55792f4b36d612fa907"} Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.241323 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.355180 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-catalog-content\") pod \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.355259 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqkfx\" (UniqueName: \"kubernetes.io/projected/36c4ce7f-79eb-4f18-8573-f6900d7812fe-kube-api-access-gqkfx\") pod \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.355330 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-utilities\") pod \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\" (UID: \"36c4ce7f-79eb-4f18-8573-f6900d7812fe\") " Mar 21 04:38:05 crc 
kubenswrapper[4839]: I0321 04:38:05.356322 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-utilities" (OuterVolumeSpecName: "utilities") pod "36c4ce7f-79eb-4f18-8573-f6900d7812fe" (UID: "36c4ce7f-79eb-4f18-8573-f6900d7812fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.362046 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c4ce7f-79eb-4f18-8573-f6900d7812fe-kube-api-access-gqkfx" (OuterVolumeSpecName: "kube-api-access-gqkfx") pod "36c4ce7f-79eb-4f18-8573-f6900d7812fe" (UID: "36c4ce7f-79eb-4f18-8573-f6900d7812fe"). InnerVolumeSpecName "kube-api-access-gqkfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.456636 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqkfx\" (UniqueName: \"kubernetes.io/projected/36c4ce7f-79eb-4f18-8573-f6900d7812fe-kube-api-access-gqkfx\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.457014 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.491197 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36c4ce7f-79eb-4f18-8573-f6900d7812fe" (UID: "36c4ce7f-79eb-4f18-8573-f6900d7812fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.558410 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c4ce7f-79eb-4f18-8573-f6900d7812fe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.871251 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-k57vv" event={"ID":"42329e42-8b9b-45ed-ab04-bf12468d8859","Type":"ContainerStarted","Data":"e899cf87c8d4d05afd09205bcf13b68549c60d7d20b9278cdf01d93eac37f787"} Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.871373 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.873553 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" event={"ID":"5a2485ca-cb21-4edf-b074-f7ac255f45f8","Type":"ContainerStarted","Data":"7a7308c909cd646a0935b0a3183be77fe7128ae384bb364352bd99ec09d0e108"} Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.873641 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.875277 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" event={"ID":"fdc1639d-742f-41a6-8cb7-318997a4a8b1","Type":"ContainerStarted","Data":"ecd307d3a0d3c7e149e08fadad6255d3a00e77345c12a089ff24cda561605b7f"} Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.876834 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" event={"ID":"8e7a66bb-3731-4f75-9a7f-5b9d07a36b39","Type":"ContainerStarted","Data":"df5cb5f435c7feedb45febdef13192c2506368609a50cd1db9d0cee2c50a3c52"} Mar 
21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.879029 4839 generic.go:334] "Generic (PLEG): container finished" podID="ad32cfd7-7b60-4c76-8df2-eb2e65b102c3" containerID="3564e41aa34a1722e5c61a5b47bf82e1bb5bc4612fbb2dc888f7e8b1d996cdd6" exitCode=0 Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.879074 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567798-k5zv2" event={"ID":"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3","Type":"ContainerDied","Data":"3564e41aa34a1722e5c61a5b47bf82e1bb5bc4612fbb2dc888f7e8b1d996cdd6"} Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.881629 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2z7jr" event={"ID":"36c4ce7f-79eb-4f18-8573-f6900d7812fe","Type":"ContainerDied","Data":"6d41a78d0e49e188c99866f252414054ca009b7f7c05cfeefe8bbaeba6908685"} Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.881666 4839 scope.go:117] "RemoveContainer" containerID="ebb35bbf64054558101b0109e0037494ebb7b1b197dba55792f4b36d612fa907" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.881666 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2z7jr" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.885029 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-k57vv" podStartSLOduration=2.191746971 podStartE2EDuration="7.885011652s" podCreationTimestamp="2026-03-21 04:37:58 +0000 UTC" firstStartedPulling="2026-03-21 04:37:59.39767136 +0000 UTC m=+883.725458036" lastFinishedPulling="2026-03-21 04:38:05.090936041 +0000 UTC m=+889.418722717" observedRunningTime="2026-03-21 04:38:05.884858967 +0000 UTC m=+890.212645673" watchObservedRunningTime="2026-03-21 04:38:05.885011652 +0000 UTC m=+890.212798348" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.903760 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-j5z4g" podStartSLOduration=1.435147887 podStartE2EDuration="6.903741484s" podCreationTimestamp="2026-03-21 04:37:59 +0000 UTC" firstStartedPulling="2026-03-21 04:37:59.634904704 +0000 UTC m=+883.962691380" lastFinishedPulling="2026-03-21 04:38:05.103498301 +0000 UTC m=+889.431284977" observedRunningTime="2026-03-21 04:38:05.899354912 +0000 UTC m=+890.227141628" watchObservedRunningTime="2026-03-21 04:38:05.903741484 +0000 UTC m=+890.231528170" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.935884 4839 scope.go:117] "RemoveContainer" containerID="e9f5e0197428b66b353464b7a4b43db5842405a6daab284a7a273837bce258b9" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.949693 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" podStartSLOduration=3.840377468 podStartE2EDuration="7.949668474s" podCreationTimestamp="2026-03-21 04:37:58 +0000 UTC" firstStartedPulling="2026-03-21 04:38:01.009670116 +0000 UTC m=+885.337456792" lastFinishedPulling="2026-03-21 04:38:05.118961122 +0000 UTC m=+889.446747798" 
observedRunningTime="2026-03-21 04:38:05.936200609 +0000 UTC m=+890.263987295" watchObservedRunningTime="2026-03-21 04:38:05.949668474 +0000 UTC m=+890.277455150" Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.956752 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2z7jr"] Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.961573 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2z7jr"] Mar 21 04:38:05 crc kubenswrapper[4839]: I0321 04:38:05.977676 4839 scope.go:117] "RemoveContainer" containerID="47364d415da3cf65d8eae0846016fd96e5b43e3a480f9d051b30b253d8dce58d" Mar 21 04:38:06 crc kubenswrapper[4839]: I0321 04:38:06.469089 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" path="/var/lib/kubelet/pods/36c4ce7f-79eb-4f18-8573-f6900d7812fe/volumes" Mar 21 04:38:07 crc kubenswrapper[4839]: I0321 04:38:07.124890 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567798-k5zv2" Mar 21 04:38:07 crc kubenswrapper[4839]: I0321 04:38:07.189321 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwkcd\" (UniqueName: \"kubernetes.io/projected/ad32cfd7-7b60-4c76-8df2-eb2e65b102c3-kube-api-access-pwkcd\") pod \"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3\" (UID: \"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3\") " Mar 21 04:38:07 crc kubenswrapper[4839]: I0321 04:38:07.203064 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad32cfd7-7b60-4c76-8df2-eb2e65b102c3-kube-api-access-pwkcd" (OuterVolumeSpecName: "kube-api-access-pwkcd") pod "ad32cfd7-7b60-4c76-8df2-eb2e65b102c3" (UID: "ad32cfd7-7b60-4c76-8df2-eb2e65b102c3"). InnerVolumeSpecName "kube-api-access-pwkcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:38:07 crc kubenswrapper[4839]: I0321 04:38:07.291201 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwkcd\" (UniqueName: \"kubernetes.io/projected/ad32cfd7-7b60-4c76-8df2-eb2e65b102c3-kube-api-access-pwkcd\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:07 crc kubenswrapper[4839]: I0321 04:38:07.895302 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567798-k5zv2" event={"ID":"ad32cfd7-7b60-4c76-8df2-eb2e65b102c3","Type":"ContainerDied","Data":"bd03cbce2553e12ebca4f813fc3769db9f3f1290d71820b0b85085cc50cc8961"} Mar 21 04:38:07 crc kubenswrapper[4839]: I0321 04:38:07.895340 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd03cbce2553e12ebca4f813fc3769db9f3f1290d71820b0b85085cc50cc8961" Mar 21 04:38:07 crc kubenswrapper[4839]: I0321 04:38:07.895341 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567798-k5zv2" Mar 21 04:38:08 crc kubenswrapper[4839]: I0321 04:38:08.172854 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567792-rhj6k"] Mar 21 04:38:08 crc kubenswrapper[4839]: I0321 04:38:08.176527 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567792-rhj6k"] Mar 21 04:38:08 crc kubenswrapper[4839]: I0321 04:38:08.466836 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbe4754f-40a1-43e0-827f-557507a5e7d1" path="/var/lib/kubelet/pods/fbe4754f-40a1-43e0-827f-557507a5e7d1/volumes" Mar 21 04:38:08 crc kubenswrapper[4839]: I0321 04:38:08.902880 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" 
event={"ID":"fdc1639d-742f-41a6-8cb7-318997a4a8b1","Type":"ContainerStarted","Data":"3766a4a598c503e95585b55577b46395d383991dadab9eee78dc2c3a32dc87a5"} Mar 21 04:38:08 crc kubenswrapper[4839]: I0321 04:38:08.931922 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z5wkc" podStartSLOduration=2.596305621 podStartE2EDuration="10.931896116s" podCreationTimestamp="2026-03-21 04:37:58 +0000 UTC" firstStartedPulling="2026-03-21 04:37:59.553629688 +0000 UTC m=+883.881416364" lastFinishedPulling="2026-03-21 04:38:07.889220183 +0000 UTC m=+892.217006859" observedRunningTime="2026-03-21 04:38:08.927130213 +0000 UTC m=+893.254916939" watchObservedRunningTime="2026-03-21 04:38:08.931896116 +0000 UTC m=+893.259682822" Mar 21 04:38:09 crc kubenswrapper[4839]: I0321 04:38:09.699887 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:38:09 crc kubenswrapper[4839]: I0321 04:38:09.699967 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:38:09 crc kubenswrapper[4839]: I0321 04:38:09.704308 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:38:09 crc kubenswrapper[4839]: I0321 04:38:09.910678 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-95579fd9f-99csd" Mar 21 04:38:09 crc kubenswrapper[4839]: I0321 04:38:09.959399 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-bj929"] Mar 21 04:38:14 crc kubenswrapper[4839]: I0321 04:38:14.412763 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-k57vv" Mar 21 04:38:20 crc kubenswrapper[4839]: I0321 04:38:20.824675 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-7ghd4" Mar 21 04:38:30 crc kubenswrapper[4839]: I0321 04:38:30.115221 4839 scope.go:117] "RemoveContainer" containerID="c106b5183e83a440589571433cb66f6749e926bbac60bb184fac0a05ac6cf93b" Mar 21 04:38:30 crc kubenswrapper[4839]: I0321 04:38:30.980625 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:38:30 crc kubenswrapper[4839]: I0321 04:38:30.981295 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.746221 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w"] Mar 21 04:38:32 crc kubenswrapper[4839]: E0321 04:38:32.746791 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerName="extract-content" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.746805 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerName="extract-content" Mar 21 04:38:32 crc kubenswrapper[4839]: E0321 04:38:32.746825 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerName="extract-utilities" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.746833 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" 
containerName="extract-utilities" Mar 21 04:38:32 crc kubenswrapper[4839]: E0321 04:38:32.746842 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerName="registry-server" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.746849 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerName="registry-server" Mar 21 04:38:32 crc kubenswrapper[4839]: E0321 04:38:32.746862 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad32cfd7-7b60-4c76-8df2-eb2e65b102c3" containerName="oc" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.746869 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad32cfd7-7b60-4c76-8df2-eb2e65b102c3" containerName="oc" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.746988 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c4ce7f-79eb-4f18-8573-f6900d7812fe" containerName="registry-server" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.746998 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad32cfd7-7b60-4c76-8df2-eb2e65b102c3" containerName="oc" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.748029 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.752221 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.762227 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w"] Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.790221 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.790284 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.790335 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7fgd\" (UniqueName: \"kubernetes.io/projected/cb3471d2-6268-4816-bc09-31044e9989e7-kube-api-access-z7fgd\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: 
I0321 04:38:32.891477 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.891591 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.891630 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7fgd\" (UniqueName: \"kubernetes.io/projected/cb3471d2-6268-4816-bc09-31044e9989e7-kube-api-access-z7fgd\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.892027 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.892048 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:32 crc kubenswrapper[4839]: I0321 04:38:32.912339 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7fgd\" (UniqueName: \"kubernetes.io/projected/cb3471d2-6268-4816-bc09-31044e9989e7-kube-api-access-z7fgd\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:33 crc kubenswrapper[4839]: I0321 04:38:33.064245 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:33 crc kubenswrapper[4839]: I0321 04:38:33.284366 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w"] Mar 21 04:38:33 crc kubenswrapper[4839]: W0321 04:38:33.295716 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb3471d2_6268_4816_bc09_31044e9989e7.slice/crio-855d03996d5a5ae5267b018c13b87d9e204fe590b85da00fec74d630249a21e0 WatchSource:0}: Error finding container 855d03996d5a5ae5267b018c13b87d9e204fe590b85da00fec74d630249a21e0: Status 404 returned error can't find the container with id 855d03996d5a5ae5267b018c13b87d9e204fe590b85da00fec74d630249a21e0 Mar 21 04:38:33 crc kubenswrapper[4839]: I0321 04:38:33.548046 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" 
event={"ID":"cb3471d2-6268-4816-bc09-31044e9989e7","Type":"ContainerStarted","Data":"161517e37efc056c02af6449afb48e53bbb04f2c08be3c2f7ef4e330e9b394a9"} Mar 21 04:38:33 crc kubenswrapper[4839]: I0321 04:38:33.548363 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" event={"ID":"cb3471d2-6268-4816-bc09-31044e9989e7","Type":"ContainerStarted","Data":"855d03996d5a5ae5267b018c13b87d9e204fe590b85da00fec74d630249a21e0"} Mar 21 04:38:34 crc kubenswrapper[4839]: I0321 04:38:34.556745 4839 generic.go:334] "Generic (PLEG): container finished" podID="cb3471d2-6268-4816-bc09-31044e9989e7" containerID="161517e37efc056c02af6449afb48e53bbb04f2c08be3c2f7ef4e330e9b394a9" exitCode=0 Mar 21 04:38:34 crc kubenswrapper[4839]: I0321 04:38:34.556796 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" event={"ID":"cb3471d2-6268-4816-bc09-31044e9989e7","Type":"ContainerDied","Data":"161517e37efc056c02af6449afb48e53bbb04f2c08be3c2f7ef4e330e9b394a9"} Mar 21 04:38:34 crc kubenswrapper[4839]: I0321 04:38:34.998649 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-bj929" podUID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" containerName="console" containerID="cri-o://fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6" gracePeriod=15 Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.366527 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-bj929_ebdfec0a-a8bf-47b0-b51a-75a76d4341f2/console/0.log" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.366840 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.531849 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-oauth-serving-cert\") pod \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.531963 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-oauth-config\") pod \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.533040 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-config\") pod \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.533106 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-trusted-ca-bundle\") pod \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.533233 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-service-ca\") pod \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.533313 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-serving-cert\") pod \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.533349 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q47zt\" (UniqueName: \"kubernetes.io/projected/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-kube-api-access-q47zt\") pod \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\" (UID: \"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2\") " Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.533934 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-service-ca" (OuterVolumeSpecName: "service-ca") pod "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" (UID: "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.534051 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" (UID: "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.533928 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" (UID: "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.534730 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-config" (OuterVolumeSpecName: "console-config") pod "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" (UID: "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.538796 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" (UID: "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.539631 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-kube-api-access-q47zt" (OuterVolumeSpecName: "kube-api-access-q47zt") pod "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" (UID: "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2"). InnerVolumeSpecName "kube-api-access-q47zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.539963 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" (UID: "ebdfec0a-a8bf-47b0-b51a-75a76d4341f2"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.565083 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-bj929_ebdfec0a-a8bf-47b0-b51a-75a76d4341f2/console/0.log" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.565141 4839 generic.go:334] "Generic (PLEG): container finished" podID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" containerID="fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6" exitCode=2 Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.565173 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bj929" event={"ID":"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2","Type":"ContainerDied","Data":"fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6"} Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.565208 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bj929" event={"ID":"ebdfec0a-a8bf-47b0-b51a-75a76d4341f2","Type":"ContainerDied","Data":"774dc5e188fdd2949c4be19591127c4007be44d80647d0129310982be9176b4a"} Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.565251 4839 scope.go:117] "RemoveContainer" containerID="fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.565369 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-bj929" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.583932 4839 scope.go:117] "RemoveContainer" containerID="fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6" Mar 21 04:38:35 crc kubenswrapper[4839]: E0321 04:38:35.584405 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6\": container with ID starting with fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6 not found: ID does not exist" containerID="fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.584452 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6"} err="failed to get container status \"fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6\": rpc error: code = NotFound desc = could not find container \"fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6\": container with ID starting with fe1fa26cb747d6735cb9f60b969aa41c7d904163dcab0be59475985097c131d6 not found: ID does not exist" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.617279 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-bj929"] Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.624387 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-bj929"] Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.635015 4839 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.635059 4839 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q47zt\" (UniqueName: \"kubernetes.io/projected/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-kube-api-access-q47zt\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.635069 4839 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.635080 4839 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.635094 4839 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-console-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.635103 4839 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:35 crc kubenswrapper[4839]: I0321 04:38:35.635112 4839 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2-service-ca\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:35 crc kubenswrapper[4839]: E0321 04:38:35.797014 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb3471d2_6268_4816_bc09_31044e9989e7.slice/crio-conmon-31a28139a6805f3245c66606cfc1ecc7d0c642e4e9170453147f7a66cff1f1a3.scope\": RecentStats: unable to find data in memory cache]" Mar 21 
04:38:36 crc kubenswrapper[4839]: I0321 04:38:36.466401 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" path="/var/lib/kubelet/pods/ebdfec0a-a8bf-47b0-b51a-75a76d4341f2/volumes" Mar 21 04:38:36 crc kubenswrapper[4839]: I0321 04:38:36.574836 4839 generic.go:334] "Generic (PLEG): container finished" podID="cb3471d2-6268-4816-bc09-31044e9989e7" containerID="31a28139a6805f3245c66606cfc1ecc7d0c642e4e9170453147f7a66cff1f1a3" exitCode=0 Mar 21 04:38:36 crc kubenswrapper[4839]: I0321 04:38:36.574929 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" event={"ID":"cb3471d2-6268-4816-bc09-31044e9989e7","Type":"ContainerDied","Data":"31a28139a6805f3245c66606cfc1ecc7d0c642e4e9170453147f7a66cff1f1a3"} Mar 21 04:38:37 crc kubenswrapper[4839]: I0321 04:38:37.584459 4839 generic.go:334] "Generic (PLEG): container finished" podID="cb3471d2-6268-4816-bc09-31044e9989e7" containerID="5b7eb1cd60982ec264def859b7887dc44e416d3dd03acaa89f0c34a2119dcf7f" exitCode=0 Mar 21 04:38:37 crc kubenswrapper[4839]: I0321 04:38:37.584533 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" event={"ID":"cb3471d2-6268-4816-bc09-31044e9989e7","Type":"ContainerDied","Data":"5b7eb1cd60982ec264def859b7887dc44e416d3dd03acaa89f0c34a2119dcf7f"} Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.799907 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.883623 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7fgd\" (UniqueName: \"kubernetes.io/projected/cb3471d2-6268-4816-bc09-31044e9989e7-kube-api-access-z7fgd\") pod \"cb3471d2-6268-4816-bc09-31044e9989e7\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.883699 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-bundle\") pod \"cb3471d2-6268-4816-bc09-31044e9989e7\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.885598 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-bundle" (OuterVolumeSpecName: "bundle") pod "cb3471d2-6268-4816-bc09-31044e9989e7" (UID: "cb3471d2-6268-4816-bc09-31044e9989e7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.885654 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-util\") pod \"cb3471d2-6268-4816-bc09-31044e9989e7\" (UID: \"cb3471d2-6268-4816-bc09-31044e9989e7\") " Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.885896 4839 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.889679 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb3471d2-6268-4816-bc09-31044e9989e7-kube-api-access-z7fgd" (OuterVolumeSpecName: "kube-api-access-z7fgd") pod "cb3471d2-6268-4816-bc09-31044e9989e7" (UID: "cb3471d2-6268-4816-bc09-31044e9989e7"). InnerVolumeSpecName "kube-api-access-z7fgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.902767 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-util" (OuterVolumeSpecName: "util") pod "cb3471d2-6268-4816-bc09-31044e9989e7" (UID: "cb3471d2-6268-4816-bc09-31044e9989e7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.986528 4839 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb3471d2-6268-4816-bc09-31044e9989e7-util\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:38 crc kubenswrapper[4839]: I0321 04:38:38.986616 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7fgd\" (UniqueName: \"kubernetes.io/projected/cb3471d2-6268-4816-bc09-31044e9989e7-kube-api-access-z7fgd\") on node \"crc\" DevicePath \"\"" Mar 21 04:38:39 crc kubenswrapper[4839]: I0321 04:38:39.600129 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" event={"ID":"cb3471d2-6268-4816-bc09-31044e9989e7","Type":"ContainerDied","Data":"855d03996d5a5ae5267b018c13b87d9e204fe590b85da00fec74d630249a21e0"} Mar 21 04:38:39 crc kubenswrapper[4839]: I0321 04:38:39.600180 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w" Mar 21 04:38:39 crc kubenswrapper[4839]: I0321 04:38:39.600182 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="855d03996d5a5ae5267b018c13b87d9e204fe590b85da00fec74d630249a21e0" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.731768 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g"] Mar 21 04:38:47 crc kubenswrapper[4839]: E0321 04:38:47.732522 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3471d2-6268-4816-bc09-31044e9989e7" containerName="pull" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.732539 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3471d2-6268-4816-bc09-31044e9989e7" containerName="pull" Mar 21 04:38:47 crc kubenswrapper[4839]: E0321 04:38:47.732550 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3471d2-6268-4816-bc09-31044e9989e7" containerName="util" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.732557 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3471d2-6268-4816-bc09-31044e9989e7" containerName="util" Mar 21 04:38:47 crc kubenswrapper[4839]: E0321 04:38:47.732616 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3471d2-6268-4816-bc09-31044e9989e7" containerName="extract" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.732627 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3471d2-6268-4816-bc09-31044e9989e7" containerName="extract" Mar 21 04:38:47 crc kubenswrapper[4839]: E0321 04:38:47.732642 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" containerName="console" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.732648 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" containerName="console" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.732769 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb3471d2-6268-4816-bc09-31044e9989e7" containerName="extract" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.732778 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebdfec0a-a8bf-47b0-b51a-75a76d4341f2" containerName="console" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.733178 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.736402 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.736416 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.736504 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-tclds" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.736613 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.736695 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.757018 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g"] Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.898177 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdpfs\" (UniqueName: 
\"kubernetes.io/projected/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-kube-api-access-cdpfs\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.898249 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-webhook-cert\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.898326 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-apiservice-cert\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.966203 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr"] Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.966860 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.969329 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-j8csk" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.969498 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.969931 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.988869 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr"] Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.999262 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca0627e2-8115-4514-ba93-47e00a823a31-webhook-cert\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.999314 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-apiservice-cert\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.999342 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l29ln\" (UniqueName: 
\"kubernetes.io/projected/ca0627e2-8115-4514-ba93-47e00a823a31-kube-api-access-l29ln\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.999392 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca0627e2-8115-4514-ba93-47e00a823a31-apiservice-cert\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.999540 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdpfs\" (UniqueName: \"kubernetes.io/projected/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-kube-api-access-cdpfs\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:47 crc kubenswrapper[4839]: I0321 04:38:47.999667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-webhook-cert\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.012936 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-webhook-cert\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " 
pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.025253 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-apiservice-cert\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.025675 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdpfs\" (UniqueName: \"kubernetes.io/projected/888cdc0b-241d-456a-9a9f-3ed253b3dbf3-kube-api-access-cdpfs\") pod \"metallb-operator-controller-manager-7b8d865685-2pk4g\" (UID: \"888cdc0b-241d-456a-9a9f-3ed253b3dbf3\") " pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.050189 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.100069 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca0627e2-8115-4514-ba93-47e00a823a31-webhook-cert\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.100128 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l29ln\" (UniqueName: \"kubernetes.io/projected/ca0627e2-8115-4514-ba93-47e00a823a31-kube-api-access-l29ln\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.100148 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca0627e2-8115-4514-ba93-47e00a823a31-apiservice-cert\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.104450 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca0627e2-8115-4514-ba93-47e00a823a31-apiservice-cert\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.104616 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ca0627e2-8115-4514-ba93-47e00a823a31-webhook-cert\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.119356 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l29ln\" (UniqueName: \"kubernetes.io/projected/ca0627e2-8115-4514-ba93-47e00a823a31-kube-api-access-l29ln\") pod \"metallb-operator-webhook-server-7df97b96d6-7wvzr\" (UID: \"ca0627e2-8115-4514-ba93-47e00a823a31\") " pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.280310 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.542943 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g"] Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.575284 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr"] Mar 21 04:38:48 crc kubenswrapper[4839]: W0321 04:38:48.585943 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca0627e2_8115_4514_ba93_47e00a823a31.slice/crio-869a06b6537f812a44258d6110cf1801eb8bea39d1ca3f2634baf3b5adcbb78f WatchSource:0}: Error finding container 869a06b6537f812a44258d6110cf1801eb8bea39d1ca3f2634baf3b5adcbb78f: Status 404 returned error can't find the container with id 869a06b6537f812a44258d6110cf1801eb8bea39d1ca3f2634baf3b5adcbb78f Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.646927 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" event={"ID":"888cdc0b-241d-456a-9a9f-3ed253b3dbf3","Type":"ContainerStarted","Data":"57ebe20bbae5523fece00f859a64c821627f6a21eb01df8a452ebd5098d55e30"} Mar 21 04:38:48 crc kubenswrapper[4839]: I0321 04:38:48.647789 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" event={"ID":"ca0627e2-8115-4514-ba93-47e00a823a31","Type":"ContainerStarted","Data":"869a06b6537f812a44258d6110cf1801eb8bea39d1ca3f2634baf3b5adcbb78f"} Mar 21 04:38:53 crc kubenswrapper[4839]: I0321 04:38:53.675286 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" event={"ID":"ca0627e2-8115-4514-ba93-47e00a823a31","Type":"ContainerStarted","Data":"1b0e0cb542507ea6f4d57710659512a3587810a5cb00d131aab412e1c70391f0"} Mar 21 04:38:53 crc kubenswrapper[4839]: I0321 04:38:53.676041 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:38:53 crc kubenswrapper[4839]: I0321 04:38:53.677648 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" event={"ID":"888cdc0b-241d-456a-9a9f-3ed253b3dbf3","Type":"ContainerStarted","Data":"c692d3dfffe307c44a10c0a8af0140f6d4ef1e60ed1617a28e0bb2b9c00fb89e"} Mar 21 04:38:53 crc kubenswrapper[4839]: I0321 04:38:53.677809 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:38:53 crc kubenswrapper[4839]: I0321 04:38:53.697480 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" podStartSLOduration=2.441074143 podStartE2EDuration="6.697455862s" podCreationTimestamp="2026-03-21 04:38:47 +0000 UTC" 
firstStartedPulling="2026-03-21 04:38:48.588963885 +0000 UTC m=+932.916750561" lastFinishedPulling="2026-03-21 04:38:52.845345604 +0000 UTC m=+937.173132280" observedRunningTime="2026-03-21 04:38:53.692120713 +0000 UTC m=+938.019907429" watchObservedRunningTime="2026-03-21 04:38:53.697455862 +0000 UTC m=+938.025242558" Mar 21 04:38:53 crc kubenswrapper[4839]: I0321 04:38:53.718262 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" podStartSLOduration=2.432048581 podStartE2EDuration="6.718241841s" podCreationTimestamp="2026-03-21 04:38:47 +0000 UTC" firstStartedPulling="2026-03-21 04:38:48.544555977 +0000 UTC m=+932.872342653" lastFinishedPulling="2026-03-21 04:38:52.830749237 +0000 UTC m=+937.158535913" observedRunningTime="2026-03-21 04:38:53.715375841 +0000 UTC m=+938.043162517" watchObservedRunningTime="2026-03-21 04:38:53.718241841 +0000 UTC m=+938.046028517" Mar 21 04:39:00 crc kubenswrapper[4839]: I0321 04:39:00.980171 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:39:00 crc kubenswrapper[4839]: I0321 04:39:00.980514 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:39:00 crc kubenswrapper[4839]: I0321 04:39:00.980560 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:39:00 crc kubenswrapper[4839]: I0321 04:39:00.981147 4839 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d9f640234dbdc5d617b2a0974e24e968076d94c55d65466d46d7d064392afc02"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:39:00 crc kubenswrapper[4839]: I0321 04:39:00.981198 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://d9f640234dbdc5d617b2a0974e24e968076d94c55d65466d46d7d064392afc02" gracePeriod=600 Mar 21 04:39:01 crc kubenswrapper[4839]: I0321 04:39:01.735800 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="d9f640234dbdc5d617b2a0974e24e968076d94c55d65466d46d7d064392afc02" exitCode=0 Mar 21 04:39:01 crc kubenswrapper[4839]: I0321 04:39:01.736107 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"d9f640234dbdc5d617b2a0974e24e968076d94c55d65466d46d7d064392afc02"} Mar 21 04:39:01 crc kubenswrapper[4839]: I0321 04:39:01.736131 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"3ca17db50991abbb7e584e1a028ac5195afd6abd747f7e5e9969a64ed39bcf6c"} Mar 21 04:39:01 crc kubenswrapper[4839]: I0321 04:39:01.736146 4839 scope.go:117] "RemoveContainer" containerID="da4c2d3dcbc2429432cc1a9b7a706caf5c1cde12d0441535caf710ea73866018" Mar 21 04:39:08 crc kubenswrapper[4839]: I0321 04:39:08.286096 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="metallb-system/metallb-operator-webhook-server-7df97b96d6-7wvzr" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.053995 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7b8d865685-2pk4g" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.766691 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-lzrf7"] Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.770001 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.771632 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb"] Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.772275 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-qtgp5" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.772426 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.772431 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.773273 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.774645 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778364 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qm7jb\" (UID: \"06b3d06a-d515-469a-9a88-77b3f1e6c6f0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778390 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk6bf\" (UniqueName: \"kubernetes.io/projected/822ff984-89c3-48d0-b420-4ecf223f8176-kube-api-access-qk6bf\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778420 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-reloader\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778447 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/822ff984-89c3-48d0-b420-4ecf223f8176-frr-startup\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778475 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-frr-sockets\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778503 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-frr-conf\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778517 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-metrics\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778531 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/822ff984-89c3-48d0-b420-4ecf223f8176-metrics-certs\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.778552 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmkxb\" (UniqueName: \"kubernetes.io/projected/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-kube-api-access-jmkxb\") pod 
\"frr-k8s-webhook-server-bcc4b6f68-qm7jb\" (UID: \"06b3d06a-d515-469a-9a88-77b3f1e6c6f0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.782012 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb"] Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.843987 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-b2wb4"] Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.845074 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.846860 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.847134 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.852895 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-d2tbg" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.853120 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.854247 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-q9zb9"] Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.866499 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879483 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-metrics-certs\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879529 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879586 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/822ff984-89c3-48d0-b420-4ecf223f8176-frr-startup\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879611 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs2jp\" (UniqueName: \"kubernetes.io/projected/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-kube-api-access-rs2jp\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879643 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-frr-sockets\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc 
kubenswrapper[4839]: I0321 04:39:28.879669 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-cert\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879692 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metallb-excludel2\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879718 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-frr-conf\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879734 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metrics-certs\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879759 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-metrics\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879779 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/822ff984-89c3-48d0-b420-4ecf223f8176-metrics-certs\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879806 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmkxb\" (UniqueName: \"kubernetes.io/projected/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-kube-api-access-jmkxb\") pod \"frr-k8s-webhook-server-bcc4b6f68-qm7jb\" (UID: \"06b3d06a-d515-469a-9a88-77b3f1e6c6f0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879824 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qm7jb\" (UID: \"06b3d06a-d515-469a-9a88-77b3f1e6c6f0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879846 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4btv\" (UniqueName: \"kubernetes.io/projected/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-kube-api-access-s4btv\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879866 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk6bf\" (UniqueName: \"kubernetes.io/projected/822ff984-89c3-48d0-b420-4ecf223f8176-kube-api-access-qk6bf\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.879894 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-reloader\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.880281 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-reloader\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.881099 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/822ff984-89c3-48d0-b420-4ecf223f8176-frr-startup\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.881309 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-frr-sockets\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.881500 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-frr-conf\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.881704 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/822ff984-89c3-48d0-b420-4ecf223f8176-metrics\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.881788 4839 secret.go:188] 
Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.881835 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/822ff984-89c3-48d0-b420-4ecf223f8176-metrics-certs podName:822ff984-89c3-48d0-b420-4ecf223f8176 nodeName:}" failed. No retries permitted until 2026-03-21 04:39:29.381820166 +0000 UTC m=+973.709606842 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/822ff984-89c3-48d0-b420-4ecf223f8176-metrics-certs") pod "frr-k8s-lzrf7" (UID: "822ff984-89c3-48d0-b420-4ecf223f8176") : secret "frr-k8s-certs-secret" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.881915 4839 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.881994 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-cert podName:06b3d06a-d515-469a-9a88-77b3f1e6c6f0 nodeName:}" failed. No retries permitted until 2026-03-21 04:39:29.38197035 +0000 UTC m=+973.709757026 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-cert") pod "frr-k8s-webhook-server-bcc4b6f68-qm7jb" (UID: "06b3d06a-d515-469a-9a88-77b3f1e6c6f0") : secret "frr-k8s-webhook-server-cert" not found Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.889826 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.930993 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-q9zb9"] Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.978369 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk6bf\" (UniqueName: \"kubernetes.io/projected/822ff984-89c3-48d0-b420-4ecf223f8176-kube-api-access-qk6bf\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.983295 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-cert\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.983361 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metallb-excludel2\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.983395 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metrics-certs\") pod \"speaker-b2wb4\" (UID: 
\"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.983474 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4btv\" (UniqueName: \"kubernetes.io/projected/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-kube-api-access-s4btv\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.983523 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-metrics-certs\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.983545 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.983593 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs2jp\" (UniqueName: \"kubernetes.io/projected/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-kube-api-access-rs2jp\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.985049 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metallb-excludel2\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:28 crc 
kubenswrapper[4839]: E0321 04:39:28.985131 4839 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.985183 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metrics-certs podName:6b330e86-2ac2-4bee-8a6e-364cb2f093d7 nodeName:}" failed. No retries permitted until 2026-03-21 04:39:29.485168077 +0000 UTC m=+973.812954753 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metrics-certs") pod "speaker-b2wb4" (UID: "6b330e86-2ac2-4bee-8a6e-364cb2f093d7") : secret "speaker-certs-secret" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.985507 4839 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.985558 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-metrics-certs podName:f0373e22-a3f9-48c6-abd6-fc8147ea49e6 nodeName:}" failed. No retries permitted until 2026-03-21 04:39:29.485536938 +0000 UTC m=+973.813323614 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-metrics-certs") pod "controller-7bb4cc7c98-q9zb9" (UID: "f0373e22-a3f9-48c6-abd6-fc8147ea49e6") : secret "controller-certs-secret" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.985621 4839 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 21 04:39:28 crc kubenswrapper[4839]: E0321 04:39:28.985648 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist podName:6b330e86-2ac2-4bee-8a6e-364cb2f093d7 nodeName:}" failed. No retries permitted until 2026-03-21 04:39:29.485640781 +0000 UTC m=+973.813427457 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist") pod "speaker-b2wb4" (UID: "6b330e86-2ac2-4bee-8a6e-364cb2f093d7") : secret "metallb-memberlist" not found Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.990902 4839 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 21 04:39:28 crc kubenswrapper[4839]: I0321 04:39:28.993294 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmkxb\" (UniqueName: \"kubernetes.io/projected/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-kube-api-access-jmkxb\") pod \"frr-k8s-webhook-server-bcc4b6f68-qm7jb\" (UID: \"06b3d06a-d515-469a-9a88-77b3f1e6c6f0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.008213 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-cert\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " 
pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.011476 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs2jp\" (UniqueName: \"kubernetes.io/projected/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-kube-api-access-rs2jp\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.018756 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4btv\" (UniqueName: \"kubernetes.io/projected/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-kube-api-access-s4btv\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.388543 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qm7jb\" (UID: \"06b3d06a-d515-469a-9a88-77b3f1e6c6f0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.388706 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/822ff984-89c3-48d0-b420-4ecf223f8176-metrics-certs\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.391744 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/822ff984-89c3-48d0-b420-4ecf223f8176-metrics-certs\") pod \"frr-k8s-lzrf7\" (UID: \"822ff984-89c3-48d0-b420-4ecf223f8176\") " pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.392189 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b3d06a-d515-469a-9a88-77b3f1e6c6f0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-qm7jb\" (UID: \"06b3d06a-d515-469a-9a88-77b3f1e6c6f0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.402364 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.490393 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metrics-certs\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.490708 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-metrics-certs\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.490771 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:29 crc kubenswrapper[4839]: E0321 04:39:29.492775 4839 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 21 04:39:29 crc kubenswrapper[4839]: E0321 04:39:29.492842 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist 
podName:6b330e86-2ac2-4bee-8a6e-364cb2f093d7 nodeName:}" failed. No retries permitted until 2026-03-21 04:39:30.492822469 +0000 UTC m=+974.820609145 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist") pod "speaker-b2wb4" (UID: "6b330e86-2ac2-4bee-8a6e-364cb2f093d7") : secret "metallb-memberlist" not found Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.496286 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0373e22-a3f9-48c6-abd6-fc8147ea49e6-metrics-certs\") pod \"controller-7bb4cc7c98-q9zb9\" (UID: \"f0373e22-a3f9-48c6-abd6-fc8147ea49e6\") " pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.496548 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-metrics-certs\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.546398 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.691370 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.826044 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb"] Mar 21 04:39:29 crc kubenswrapper[4839]: W0321 04:39:29.832124 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06b3d06a_d515_469a_9a88_77b3f1e6c6f0.slice/crio-f8dd9234facab32c48e559e6f4cb604cf7d06eb6bc747d7815442d5d620ef3dc WatchSource:0}: Error finding container f8dd9234facab32c48e559e6f4cb604cf7d06eb6bc747d7815442d5d620ef3dc: Status 404 returned error can't find the container with id f8dd9234facab32c48e559e6f4cb604cf7d06eb6bc747d7815442d5d620ef3dc Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.834277 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 04:39:29 crc kubenswrapper[4839]: I0321 04:39:29.955307 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-q9zb9"] Mar 21 04:39:29 crc kubenswrapper[4839]: W0321 04:39:29.960052 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0373e22_a3f9_48c6_abd6_fc8147ea49e6.slice/crio-924d7bcbde774bd43b73cc101eb7776cf9c377e8b5d1ee41f29bdb41a709d632 WatchSource:0}: Error finding container 924d7bcbde774bd43b73cc101eb7776cf9c377e8b5d1ee41f29bdb41a709d632: Status 404 returned error can't find the container with id 924d7bcbde774bd43b73cc101eb7776cf9c377e8b5d1ee41f29bdb41a709d632 Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.506018 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " 
pod="metallb-system/speaker-b2wb4" Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.519108 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6b330e86-2ac2-4bee-8a6e-364cb2f093d7-memberlist\") pod \"speaker-b2wb4\" (UID: \"6b330e86-2ac2-4bee-8a6e-364cb2f093d7\") " pod="metallb-system/speaker-b2wb4" Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.619718 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-q9zb9" event={"ID":"f0373e22-a3f9-48c6-abd6-fc8147ea49e6","Type":"ContainerStarted","Data":"b68f5220ef1f6b732d5018555e549c5121a31dbec8e183b5866f019908cd4660"} Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.619791 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-q9zb9" event={"ID":"f0373e22-a3f9-48c6-abd6-fc8147ea49e6","Type":"ContainerStarted","Data":"61ec918888cba3cb29538cac06ead0d68a6acbc5f28a437c83970641a050afe1"} Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.619807 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-q9zb9" event={"ID":"f0373e22-a3f9-48c6-abd6-fc8147ea49e6","Type":"ContainerStarted","Data":"924d7bcbde774bd43b73cc101eb7776cf9c377e8b5d1ee41f29bdb41a709d632"} Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.619928 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.621124 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerStarted","Data":"ba42772066ec959ddafaf8cd84741af0ac4d463f4e44cf3893305b9d3ba41a03"} Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.622845 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" event={"ID":"06b3d06a-d515-469a-9a88-77b3f1e6c6f0","Type":"ContainerStarted","Data":"f8dd9234facab32c48e559e6f4cb604cf7d06eb6bc747d7815442d5d620ef3dc"} Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.651969 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-q9zb9" podStartSLOduration=2.651943286 podStartE2EDuration="2.651943286s" podCreationTimestamp="2026-03-21 04:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:39:30.647232235 +0000 UTC m=+974.975018931" watchObservedRunningTime="2026-03-21 04:39:30.651943286 +0000 UTC m=+974.979730002" Mar 21 04:39:30 crc kubenswrapper[4839]: I0321 04:39:30.709697 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-b2wb4" Mar 21 04:39:31 crc kubenswrapper[4839]: I0321 04:39:31.629847 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b2wb4" event={"ID":"6b330e86-2ac2-4bee-8a6e-364cb2f093d7","Type":"ContainerStarted","Data":"5f20e86978e4b21f36c5428c36388b126f2db7acec1b5806b78d5311837d5785"} Mar 21 04:39:31 crc kubenswrapper[4839]: I0321 04:39:31.630215 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b2wb4" event={"ID":"6b330e86-2ac2-4bee-8a6e-364cb2f093d7","Type":"ContainerStarted","Data":"68418a673127c1e259429faa893b533c6e96a97fb321206a185c6baa1e3e125f"} Mar 21 04:39:31 crc kubenswrapper[4839]: I0321 04:39:31.630230 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b2wb4" event={"ID":"6b330e86-2ac2-4bee-8a6e-364cb2f093d7","Type":"ContainerStarted","Data":"54f9dccf15e053f6cd6469db3f8d9afa547b5dae24f884f45d7072078aeb511a"} Mar 21 04:39:31 crc kubenswrapper[4839]: I0321 04:39:31.630381 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="metallb-system/speaker-b2wb4" Mar 21 04:39:31 crc kubenswrapper[4839]: I0321 04:39:31.646843 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-b2wb4" podStartSLOduration=3.646819308 podStartE2EDuration="3.646819308s" podCreationTimestamp="2026-03-21 04:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:39:31.644847283 +0000 UTC m=+975.972633959" watchObservedRunningTime="2026-03-21 04:39:31.646819308 +0000 UTC m=+975.974605984" Mar 21 04:39:37 crc kubenswrapper[4839]: I0321 04:39:37.675074 4839 generic.go:334] "Generic (PLEG): container finished" podID="822ff984-89c3-48d0-b420-4ecf223f8176" containerID="565bac187c28c85d8dfcd7bd88242ad8104b6bd3ee7a4f401ac27d881d077f3a" exitCode=0 Mar 21 04:39:37 crc kubenswrapper[4839]: I0321 04:39:37.675406 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerDied","Data":"565bac187c28c85d8dfcd7bd88242ad8104b6bd3ee7a4f401ac27d881d077f3a"} Mar 21 04:39:37 crc kubenswrapper[4839]: I0321 04:39:37.677653 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" event={"ID":"06b3d06a-d515-469a-9a88-77b3f1e6c6f0","Type":"ContainerStarted","Data":"56dc8d83024726e36a1f005b734afb7ee64b3dc6b8573f3f079ae6b51be5a03c"} Mar 21 04:39:37 crc kubenswrapper[4839]: I0321 04:39:37.678348 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:37 crc kubenswrapper[4839]: I0321 04:39:37.736577 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" podStartSLOduration=2.747086549 podStartE2EDuration="9.736495689s" podCreationTimestamp="2026-03-21 04:39:28 +0000 UTC" 
firstStartedPulling="2026-03-21 04:39:29.834093077 +0000 UTC m=+974.161879753" lastFinishedPulling="2026-03-21 04:39:36.823502197 +0000 UTC m=+981.151288893" observedRunningTime="2026-03-21 04:39:37.694088912 +0000 UTC m=+982.021891929" watchObservedRunningTime="2026-03-21 04:39:37.736495689 +0000 UTC m=+982.064282365" Mar 21 04:39:38 crc kubenswrapper[4839]: I0321 04:39:38.689677 4839 generic.go:334] "Generic (PLEG): container finished" podID="822ff984-89c3-48d0-b420-4ecf223f8176" containerID="e2c95d7b3b135c19b82da28111b6b85c7851811d4dc2ed3bbc6166eb62c44fc2" exitCode=0 Mar 21 04:39:38 crc kubenswrapper[4839]: I0321 04:39:38.689767 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerDied","Data":"e2c95d7b3b135c19b82da28111b6b85c7851811d4dc2ed3bbc6166eb62c44fc2"} Mar 21 04:39:39 crc kubenswrapper[4839]: I0321 04:39:39.705644 4839 generic.go:334] "Generic (PLEG): container finished" podID="822ff984-89c3-48d0-b420-4ecf223f8176" containerID="fee246ca150bd459f2420666e9da817cd62f654c9584340fc5779d4c1a29128a" exitCode=0 Mar 21 04:39:39 crc kubenswrapper[4839]: I0321 04:39:39.706250 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerDied","Data":"fee246ca150bd459f2420666e9da817cd62f654c9584340fc5779d4c1a29128a"} Mar 21 04:39:40 crc kubenswrapper[4839]: I0321 04:39:40.717232 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerStarted","Data":"6cbc1c4b8cb154df282fd3655e9a70ad7fadc36b8d3c20ce4d9ea29329d1a2a6"} Mar 21 04:39:40 crc kubenswrapper[4839]: I0321 04:39:40.717664 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" 
event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerStarted","Data":"d6afc1444d4a08a428f6f1ad5a0020b67e53b28d3807691f1253dce24bad5553"} Mar 21 04:39:40 crc kubenswrapper[4839]: I0321 04:39:40.717687 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerStarted","Data":"09583428a431967fdeee2fdd5a64a97d93655d0b8a6f4620d364eb2bb9293a9a"} Mar 21 04:39:40 crc kubenswrapper[4839]: I0321 04:39:40.717703 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerStarted","Data":"4340d096c43af3372f6634220f77336f714d48d74e9bbaf7919145c20384c29c"} Mar 21 04:39:40 crc kubenswrapper[4839]: I0321 04:39:40.717719 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerStarted","Data":"4461e8f0f78d715175ffaf25277a13356d4cd70a7e5240aed9ea68302cc779e0"} Mar 21 04:39:41 crc kubenswrapper[4839]: I0321 04:39:41.728397 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lzrf7" event={"ID":"822ff984-89c3-48d0-b420-4ecf223f8176","Type":"ContainerStarted","Data":"ab37b8eb272fbf8a026d266846e789be95e7c4c569d14d26432c35b673ce942d"} Mar 21 04:39:41 crc kubenswrapper[4839]: I0321 04:39:41.728590 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:41 crc kubenswrapper[4839]: I0321 04:39:41.764980 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-lzrf7" podStartSLOduration=6.847850868 podStartE2EDuration="13.764963916s" podCreationTimestamp="2026-03-21 04:39:28 +0000 UTC" firstStartedPulling="2026-03-21 04:39:29.891282226 +0000 UTC m=+974.219068902" lastFinishedPulling="2026-03-21 04:39:36.808395264 +0000 UTC m=+981.136181950" 
observedRunningTime="2026-03-21 04:39:41.760328967 +0000 UTC m=+986.088115733" watchObservedRunningTime="2026-03-21 04:39:41.764963916 +0000 UTC m=+986.092750592" Mar 21 04:39:44 crc kubenswrapper[4839]: I0321 04:39:44.691770 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:44 crc kubenswrapper[4839]: I0321 04:39:44.732007 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:49 crc kubenswrapper[4839]: I0321 04:39:49.407340 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-qm7jb" Mar 21 04:39:49 crc kubenswrapper[4839]: I0321 04:39:49.550727 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-q9zb9" Mar 21 04:39:49 crc kubenswrapper[4839]: I0321 04:39:49.694735 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-lzrf7" Mar 21 04:39:50 crc kubenswrapper[4839]: I0321 04:39:50.713510 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-b2wb4" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.493520 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7pfkn"] Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.494834 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7pfkn" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.496856 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-5zf2m" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.496970 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.497413 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.537313 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwwmm\" (UniqueName: \"kubernetes.io/projected/e7f6aac9-7315-491e-b5b1-e0a5e43c1387-kube-api-access-hwwmm\") pod \"openstack-operator-index-7pfkn\" (UID: \"e7f6aac9-7315-491e-b5b1-e0a5e43c1387\") " pod="openstack-operators/openstack-operator-index-7pfkn" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.552065 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7pfkn"] Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.638744 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwwmm\" (UniqueName: \"kubernetes.io/projected/e7f6aac9-7315-491e-b5b1-e0a5e43c1387-kube-api-access-hwwmm\") pod \"openstack-operator-index-7pfkn\" (UID: \"e7f6aac9-7315-491e-b5b1-e0a5e43c1387\") " pod="openstack-operators/openstack-operator-index-7pfkn" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.660814 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwwmm\" (UniqueName: \"kubernetes.io/projected/e7f6aac9-7315-491e-b5b1-e0a5e43c1387-kube-api-access-hwwmm\") pod \"openstack-operator-index-7pfkn\" (UID: 
\"e7f6aac9-7315-491e-b5b1-e0a5e43c1387\") " pod="openstack-operators/openstack-operator-index-7pfkn" Mar 21 04:39:53 crc kubenswrapper[4839]: I0321 04:39:53.824282 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7pfkn" Mar 21 04:39:54 crc kubenswrapper[4839]: I0321 04:39:54.290086 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7pfkn"] Mar 21 04:39:54 crc kubenswrapper[4839]: I0321 04:39:54.820371 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7pfkn" event={"ID":"e7f6aac9-7315-491e-b5b1-e0a5e43c1387","Type":"ContainerStarted","Data":"f9459f4578385c2c260c563d32018cecfef8f292c4211136802ae9d2d156071f"} Mar 21 04:39:55 crc kubenswrapper[4839]: I0321 04:39:55.672561 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7pfkn"] Mar 21 04:39:56 crc kubenswrapper[4839]: I0321 04:39:56.286564 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-lj8h4"] Mar 21 04:39:56 crc kubenswrapper[4839]: I0321 04:39:56.287281 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lj8h4" Mar 21 04:39:56 crc kubenswrapper[4839]: I0321 04:39:56.295872 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lj8h4"] Mar 21 04:39:56 crc kubenswrapper[4839]: I0321 04:39:56.474529 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95j7t\" (UniqueName: \"kubernetes.io/projected/6ff65f56-ff89-43c6-b087-6d3c3b72d2ef-kube-api-access-95j7t\") pod \"openstack-operator-index-lj8h4\" (UID: \"6ff65f56-ff89-43c6-b087-6d3c3b72d2ef\") " pod="openstack-operators/openstack-operator-index-lj8h4" Mar 21 04:39:56 crc kubenswrapper[4839]: I0321 04:39:56.575705 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95j7t\" (UniqueName: \"kubernetes.io/projected/6ff65f56-ff89-43c6-b087-6d3c3b72d2ef-kube-api-access-95j7t\") pod \"openstack-operator-index-lj8h4\" (UID: \"6ff65f56-ff89-43c6-b087-6d3c3b72d2ef\") " pod="openstack-operators/openstack-operator-index-lj8h4" Mar 21 04:39:56 crc kubenswrapper[4839]: I0321 04:39:56.597870 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95j7t\" (UniqueName: \"kubernetes.io/projected/6ff65f56-ff89-43c6-b087-6d3c3b72d2ef-kube-api-access-95j7t\") pod \"openstack-operator-index-lj8h4\" (UID: \"6ff65f56-ff89-43c6-b087-6d3c3b72d2ef\") " pod="openstack-operators/openstack-operator-index-lj8h4" Mar 21 04:39:56 crc kubenswrapper[4839]: I0321 04:39:56.612633 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lj8h4" Mar 21 04:39:57 crc kubenswrapper[4839]: I0321 04:39:57.887122 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lj8h4"] Mar 21 04:39:58 crc kubenswrapper[4839]: I0321 04:39:58.850860 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7pfkn" event={"ID":"e7f6aac9-7315-491e-b5b1-e0a5e43c1387","Type":"ContainerStarted","Data":"4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057"} Mar 21 04:39:58 crc kubenswrapper[4839]: I0321 04:39:58.851024 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-7pfkn" podUID="e7f6aac9-7315-491e-b5b1-e0a5e43c1387" containerName="registry-server" containerID="cri-o://4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057" gracePeriod=2 Mar 21 04:39:58 crc kubenswrapper[4839]: I0321 04:39:58.852990 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lj8h4" event={"ID":"6ff65f56-ff89-43c6-b087-6d3c3b72d2ef","Type":"ContainerStarted","Data":"d8d5f4623ef46362fa062476beb5cd44fe699aee49dc6ca663cf74cf54f14f4b"} Mar 21 04:39:58 crc kubenswrapper[4839]: I0321 04:39:58.853045 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lj8h4" event={"ID":"6ff65f56-ff89-43c6-b087-6d3c3b72d2ef","Type":"ContainerStarted","Data":"9bf5dc4a328f57563a89f14d7276635e4dd98c6451e9b20265d01aba9066d661"} Mar 21 04:39:58 crc kubenswrapper[4839]: I0321 04:39:58.880081 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7pfkn" podStartSLOduration=1.985557412 podStartE2EDuration="5.880056452s" podCreationTimestamp="2026-03-21 04:39:53 +0000 UTC" firstStartedPulling="2026-03-21 04:39:54.303742438 +0000 UTC 
m=+998.631529154" lastFinishedPulling="2026-03-21 04:39:58.198241508 +0000 UTC m=+1002.526028194" observedRunningTime="2026-03-21 04:39:58.874759834 +0000 UTC m=+1003.202546530" watchObservedRunningTime="2026-03-21 04:39:58.880056452 +0000 UTC m=+1003.207843158" Mar 21 04:39:58 crc kubenswrapper[4839]: I0321 04:39:58.901129 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-lj8h4" podStartSLOduration=2.849441026 podStartE2EDuration="2.901096951s" podCreationTimestamp="2026-03-21 04:39:56 +0000 UTC" firstStartedPulling="2026-03-21 04:39:58.147314243 +0000 UTC m=+1002.475100919" lastFinishedPulling="2026-03-21 04:39:58.198970168 +0000 UTC m=+1002.526756844" observedRunningTime="2026-03-21 04:39:58.892021587 +0000 UTC m=+1003.219808273" watchObservedRunningTime="2026-03-21 04:39:58.901096951 +0000 UTC m=+1003.228883667" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.345414 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7pfkn" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.516265 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwwmm\" (UniqueName: \"kubernetes.io/projected/e7f6aac9-7315-491e-b5b1-e0a5e43c1387-kube-api-access-hwwmm\") pod \"e7f6aac9-7315-491e-b5b1-e0a5e43c1387\" (UID: \"e7f6aac9-7315-491e-b5b1-e0a5e43c1387\") " Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.523040 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7f6aac9-7315-491e-b5b1-e0a5e43c1387-kube-api-access-hwwmm" (OuterVolumeSpecName: "kube-api-access-hwwmm") pod "e7f6aac9-7315-491e-b5b1-e0a5e43c1387" (UID: "e7f6aac9-7315-491e-b5b1-e0a5e43c1387"). InnerVolumeSpecName "kube-api-access-hwwmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.618152 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwwmm\" (UniqueName: \"kubernetes.io/projected/e7f6aac9-7315-491e-b5b1-e0a5e43c1387-kube-api-access-hwwmm\") on node \"crc\" DevicePath \"\"" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.859500 4839 generic.go:334] "Generic (PLEG): container finished" podID="e7f6aac9-7315-491e-b5b1-e0a5e43c1387" containerID="4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057" exitCode=0 Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.859585 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7pfkn" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.859596 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7pfkn" event={"ID":"e7f6aac9-7315-491e-b5b1-e0a5e43c1387","Type":"ContainerDied","Data":"4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057"} Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.859633 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7pfkn" event={"ID":"e7f6aac9-7315-491e-b5b1-e0a5e43c1387","Type":"ContainerDied","Data":"f9459f4578385c2c260c563d32018cecfef8f292c4211136802ae9d2d156071f"} Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.859650 4839 scope.go:117] "RemoveContainer" containerID="4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.874506 4839 scope.go:117] "RemoveContainer" containerID="4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057" Mar 21 04:39:59 crc kubenswrapper[4839]: E0321 04:39:59.874893 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057\": container with ID starting with 4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057 not found: ID does not exist" containerID="4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.874932 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057"} err="failed to get container status \"4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057\": rpc error: code = NotFound desc = could not find container \"4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057\": container with ID starting with 4b98942741c949f340c09c4da5ff5a74fbdefa3105379c38b5bbee4104f0a057 not found: ID does not exist" Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.887047 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7pfkn"] Mar 21 04:39:59 crc kubenswrapper[4839]: I0321 04:39:59.890773 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-7pfkn"] Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.145511 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567800-hzcbk"] Mar 21 04:40:00 crc kubenswrapper[4839]: E0321 04:40:00.145936 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7f6aac9-7315-491e-b5b1-e0a5e43c1387" containerName="registry-server" Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.145960 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f6aac9-7315-491e-b5b1-e0a5e43c1387" containerName="registry-server" Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.146178 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7f6aac9-7315-491e-b5b1-e0a5e43c1387" containerName="registry-server" Mar 21 
04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.146798 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567800-hzcbk" Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.150489 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567800-hzcbk"] Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.150617 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.150617 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.150631 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.326616 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9nrd\" (UniqueName: \"kubernetes.io/projected/4a2cd29b-967b-4cf6-9902-6f30ad049cb1-kube-api-access-t9nrd\") pod \"auto-csr-approver-29567800-hzcbk\" (UID: \"4a2cd29b-967b-4cf6-9902-6f30ad049cb1\") " pod="openshift-infra/auto-csr-approver-29567800-hzcbk" Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.428663 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9nrd\" (UniqueName: \"kubernetes.io/projected/4a2cd29b-967b-4cf6-9902-6f30ad049cb1-kube-api-access-t9nrd\") pod \"auto-csr-approver-29567800-hzcbk\" (UID: \"4a2cd29b-967b-4cf6-9902-6f30ad049cb1\") " pod="openshift-infra/auto-csr-approver-29567800-hzcbk" Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.454934 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9nrd\" (UniqueName: 
\"kubernetes.io/projected/4a2cd29b-967b-4cf6-9902-6f30ad049cb1-kube-api-access-t9nrd\") pod \"auto-csr-approver-29567800-hzcbk\" (UID: \"4a2cd29b-967b-4cf6-9902-6f30ad049cb1\") " pod="openshift-infra/auto-csr-approver-29567800-hzcbk" Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.465055 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567800-hzcbk" Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.468850 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7f6aac9-7315-491e-b5b1-e0a5e43c1387" path="/var/lib/kubelet/pods/e7f6aac9-7315-491e-b5b1-e0a5e43c1387/volumes" Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.856167 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567800-hzcbk"] Mar 21 04:40:00 crc kubenswrapper[4839]: I0321 04:40:00.871909 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567800-hzcbk" event={"ID":"4a2cd29b-967b-4cf6-9902-6f30ad049cb1","Type":"ContainerStarted","Data":"5678ff93115193b208da67b4ac1cc4702c554276d60ee0d4de4653dda74e182d"} Mar 21 04:40:02 crc kubenswrapper[4839]: I0321 04:40:02.886869 4839 generic.go:334] "Generic (PLEG): container finished" podID="4a2cd29b-967b-4cf6-9902-6f30ad049cb1" containerID="54072f0390a561fb948d238ef6ee4fb04223cd43a9ba8e8eef297b621c8367df" exitCode=0 Mar 21 04:40:02 crc kubenswrapper[4839]: I0321 04:40:02.886922 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567800-hzcbk" event={"ID":"4a2cd29b-967b-4cf6-9902-6f30ad049cb1","Type":"ContainerDied","Data":"54072f0390a561fb948d238ef6ee4fb04223cd43a9ba8e8eef297b621c8367df"} Mar 21 04:40:04 crc kubenswrapper[4839]: I0321 04:40:04.230839 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567800-hzcbk" Mar 21 04:40:04 crc kubenswrapper[4839]: I0321 04:40:04.415050 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9nrd\" (UniqueName: \"kubernetes.io/projected/4a2cd29b-967b-4cf6-9902-6f30ad049cb1-kube-api-access-t9nrd\") pod \"4a2cd29b-967b-4cf6-9902-6f30ad049cb1\" (UID: \"4a2cd29b-967b-4cf6-9902-6f30ad049cb1\") " Mar 21 04:40:04 crc kubenswrapper[4839]: I0321 04:40:04.424376 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2cd29b-967b-4cf6-9902-6f30ad049cb1-kube-api-access-t9nrd" (OuterVolumeSpecName: "kube-api-access-t9nrd") pod "4a2cd29b-967b-4cf6-9902-6f30ad049cb1" (UID: "4a2cd29b-967b-4cf6-9902-6f30ad049cb1"). InnerVolumeSpecName "kube-api-access-t9nrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:40:04 crc kubenswrapper[4839]: I0321 04:40:04.516861 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9nrd\" (UniqueName: \"kubernetes.io/projected/4a2cd29b-967b-4cf6-9902-6f30ad049cb1-kube-api-access-t9nrd\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:04 crc kubenswrapper[4839]: I0321 04:40:04.908059 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567800-hzcbk" event={"ID":"4a2cd29b-967b-4cf6-9902-6f30ad049cb1","Type":"ContainerDied","Data":"5678ff93115193b208da67b4ac1cc4702c554276d60ee0d4de4653dda74e182d"} Mar 21 04:40:04 crc kubenswrapper[4839]: I0321 04:40:04.908382 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5678ff93115193b208da67b4ac1cc4702c554276d60ee0d4de4653dda74e182d" Mar 21 04:40:04 crc kubenswrapper[4839]: I0321 04:40:04.908131 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567800-hzcbk" Mar 21 04:40:05 crc kubenswrapper[4839]: I0321 04:40:05.297616 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567794-rclnt"] Mar 21 04:40:05 crc kubenswrapper[4839]: I0321 04:40:05.302936 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567794-rclnt"] Mar 21 04:40:06 crc kubenswrapper[4839]: I0321 04:40:06.466783 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dfa2356-3aca-4ed1-bfce-93cc8857825d" path="/var/lib/kubelet/pods/2dfa2356-3aca-4ed1-bfce-93cc8857825d/volumes" Mar 21 04:40:06 crc kubenswrapper[4839]: I0321 04:40:06.613117 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-lj8h4" Mar 21 04:40:06 crc kubenswrapper[4839]: I0321 04:40:06.613278 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-lj8h4" Mar 21 04:40:06 crc kubenswrapper[4839]: I0321 04:40:06.661758 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-lj8h4" Mar 21 04:40:06 crc kubenswrapper[4839]: I0321 04:40:06.951451 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-lj8h4" Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.087344 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6q2g2"] Mar 21 04:40:10 crc kubenswrapper[4839]: E0321 04:40:10.089232 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2cd29b-967b-4cf6-9902-6f30ad049cb1" containerName="oc" Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.089278 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2cd29b-967b-4cf6-9902-6f30ad049cb1" containerName="oc" Mar 21 
04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.089633 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2cd29b-967b-4cf6-9902-6f30ad049cb1" containerName="oc" Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.091563 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.108666 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6q2g2"] Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.194178 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-utilities\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.194707 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-catalog-content\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.194768 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz8c9\" (UniqueName: \"kubernetes.io/projected/befc88a7-caca-450d-b23e-c4382b36217e-kube-api-access-fz8c9\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.295667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-utilities\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.295721 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-catalog-content\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.295775 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz8c9\" (UniqueName: \"kubernetes.io/projected/befc88a7-caca-450d-b23e-c4382b36217e-kube-api-access-fz8c9\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.296197 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-catalog-content\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.296255 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-utilities\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.321614 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz8c9\" (UniqueName: 
\"kubernetes.io/projected/befc88a7-caca-450d-b23e-c4382b36217e-kube-api-access-fz8c9\") pod \"community-operators-6q2g2\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.422447 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.900476 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6q2g2"] Mar 21 04:40:10 crc kubenswrapper[4839]: I0321 04:40:10.962321 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q2g2" event={"ID":"befc88a7-caca-450d-b23e-c4382b36217e","Type":"ContainerStarted","Data":"ce9d7fdc03ee552772de04055fb99bd937e17bd28a54562444504527ae42320e"} Mar 21 04:40:11 crc kubenswrapper[4839]: I0321 04:40:11.985688 4839 generic.go:334] "Generic (PLEG): container finished" podID="befc88a7-caca-450d-b23e-c4382b36217e" containerID="4f8c28d0fe8376d6fb6dbfcfb14d27aa066b107c300eb4d596d4ace562332e7b" exitCode=0 Mar 21 04:40:11 crc kubenswrapper[4839]: I0321 04:40:11.985990 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q2g2" event={"ID":"befc88a7-caca-450d-b23e-c4382b36217e","Type":"ContainerDied","Data":"4f8c28d0fe8376d6fb6dbfcfb14d27aa066b107c300eb4d596d4ace562332e7b"} Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.520176 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"] Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.521656 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.524779 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tzl2s" Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.532855 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"] Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.637474 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-bundle\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.637553 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94nth\" (UniqueName: \"kubernetes.io/projected/f63f3493-d532-4d99-94c0-ab8648252dab-kube-api-access-94nth\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.637609 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-util\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 
04:40:13.739614 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-bundle\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.740137 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-bundle\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.740220 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94nth\" (UniqueName: \"kubernetes.io/projected/f63f3493-d532-4d99-94c0-ab8648252dab-kube-api-access-94nth\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.740303 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-util\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.741063 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-util\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.763055 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94nth\" (UniqueName: \"kubernetes.io/projected/f63f3493-d532-4d99-94c0-ab8648252dab-kube-api-access-94nth\") pod \"e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" Mar 21 04:40:13 crc kubenswrapper[4839]: I0321 04:40:13.839819 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" Mar 21 04:40:14 crc kubenswrapper[4839]: I0321 04:40:14.000122 4839 generic.go:334] "Generic (PLEG): container finished" podID="befc88a7-caca-450d-b23e-c4382b36217e" containerID="7dbabf80cccb9957a73998b24ba4a430e810a5d74896e5943d1676694432404b" exitCode=0 Mar 21 04:40:14 crc kubenswrapper[4839]: I0321 04:40:14.000174 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q2g2" event={"ID":"befc88a7-caca-450d-b23e-c4382b36217e","Type":"ContainerDied","Data":"7dbabf80cccb9957a73998b24ba4a430e810a5d74896e5943d1676694432404b"} Mar 21 04:40:14 crc kubenswrapper[4839]: I0321 04:40:14.071360 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m"] Mar 21 04:40:15 crc kubenswrapper[4839]: I0321 04:40:15.009232 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q2g2" 
event={"ID":"befc88a7-caca-450d-b23e-c4382b36217e","Type":"ContainerStarted","Data":"00e0222729d7c8ca5cf26a69c3049542634e97119934b9dc854c6eadb23842db"} Mar 21 04:40:15 crc kubenswrapper[4839]: I0321 04:40:15.010838 4839 generic.go:334] "Generic (PLEG): container finished" podID="f63f3493-d532-4d99-94c0-ab8648252dab" containerID="57be45a01d1b8d4cd697b274c6ed49d7b6536d5cdf8d1a3add452abc67e651b6" exitCode=0 Mar 21 04:40:15 crc kubenswrapper[4839]: I0321 04:40:15.010863 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" event={"ID":"f63f3493-d532-4d99-94c0-ab8648252dab","Type":"ContainerDied","Data":"57be45a01d1b8d4cd697b274c6ed49d7b6536d5cdf8d1a3add452abc67e651b6"} Mar 21 04:40:15 crc kubenswrapper[4839]: I0321 04:40:15.010892 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" event={"ID":"f63f3493-d532-4d99-94c0-ab8648252dab","Type":"ContainerStarted","Data":"38fdf9bf3572e061d6030fa507bfc73e742e3df2a2a60745bef1f77d03acf33c"} Mar 21 04:40:15 crc kubenswrapper[4839]: I0321 04:40:15.038154 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6q2g2" podStartSLOduration=2.489975513 podStartE2EDuration="5.038138039s" podCreationTimestamp="2026-03-21 04:40:10 +0000 UTC" firstStartedPulling="2026-03-21 04:40:11.987785444 +0000 UTC m=+1016.315572120" lastFinishedPulling="2026-03-21 04:40:14.53594796 +0000 UTC m=+1018.863734646" observedRunningTime="2026-03-21 04:40:15.031865764 +0000 UTC m=+1019.359652440" watchObservedRunningTime="2026-03-21 04:40:15.038138039 +0000 UTC m=+1019.365924705" Mar 21 04:40:16 crc kubenswrapper[4839]: I0321 04:40:16.018826 4839 generic.go:334] "Generic (PLEG): container finished" podID="f63f3493-d532-4d99-94c0-ab8648252dab" 
containerID="a5341302af03e8f55e21dd4989a0b3c126f401ec6e24ea0e494248009cb0d09c" exitCode=0 Mar 21 04:40:16 crc kubenswrapper[4839]: I0321 04:40:16.018922 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" event={"ID":"f63f3493-d532-4d99-94c0-ab8648252dab","Type":"ContainerDied","Data":"a5341302af03e8f55e21dd4989a0b3c126f401ec6e24ea0e494248009cb0d09c"} Mar 21 04:40:17 crc kubenswrapper[4839]: I0321 04:40:17.027813 4839 generic.go:334] "Generic (PLEG): container finished" podID="f63f3493-d532-4d99-94c0-ab8648252dab" containerID="334a72c7ab7081387e62857481e9ea50d715e161c32f5884fbc232169d834d0a" exitCode=0 Mar 21 04:40:17 crc kubenswrapper[4839]: I0321 04:40:17.027850 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" event={"ID":"f63f3493-d532-4d99-94c0-ab8648252dab","Type":"ContainerDied","Data":"334a72c7ab7081387e62857481e9ea50d715e161c32f5884fbc232169d834d0a"} Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.405755 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.502986 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94nth\" (UniqueName: \"kubernetes.io/projected/f63f3493-d532-4d99-94c0-ab8648252dab-kube-api-access-94nth\") pod \"f63f3493-d532-4d99-94c0-ab8648252dab\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.503552 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-bundle\") pod \"f63f3493-d532-4d99-94c0-ab8648252dab\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.503627 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-util\") pod \"f63f3493-d532-4d99-94c0-ab8648252dab\" (UID: \"f63f3493-d532-4d99-94c0-ab8648252dab\") " Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.505796 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-bundle" (OuterVolumeSpecName: "bundle") pod "f63f3493-d532-4d99-94c0-ab8648252dab" (UID: "f63f3493-d532-4d99-94c0-ab8648252dab"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.512508 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f63f3493-d532-4d99-94c0-ab8648252dab-kube-api-access-94nth" (OuterVolumeSpecName: "kube-api-access-94nth") pod "f63f3493-d532-4d99-94c0-ab8648252dab" (UID: "f63f3493-d532-4d99-94c0-ab8648252dab"). InnerVolumeSpecName "kube-api-access-94nth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.545204 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-util" (OuterVolumeSpecName: "util") pod "f63f3493-d532-4d99-94c0-ab8648252dab" (UID: "f63f3493-d532-4d99-94c0-ab8648252dab"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.605231 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94nth\" (UniqueName: \"kubernetes.io/projected/f63f3493-d532-4d99-94c0-ab8648252dab-kube-api-access-94nth\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.605265 4839 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:18 crc kubenswrapper[4839]: I0321 04:40:18.605275 4839 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f63f3493-d532-4d99-94c0-ab8648252dab-util\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:19 crc kubenswrapper[4839]: I0321 04:40:19.066344 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" event={"ID":"f63f3493-d532-4d99-94c0-ab8648252dab","Type":"ContainerDied","Data":"38fdf9bf3572e061d6030fa507bfc73e742e3df2a2a60745bef1f77d03acf33c"} Mar 21 04:40:19 crc kubenswrapper[4839]: I0321 04:40:19.066422 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38fdf9bf3572e061d6030fa507bfc73e742e3df2a2a60745bef1f77d03acf33c" Mar 21 04:40:19 crc kubenswrapper[4839]: I0321 04:40:19.066441 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m" Mar 21 04:40:20 crc kubenswrapper[4839]: I0321 04:40:20.423352 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:20 crc kubenswrapper[4839]: I0321 04:40:20.423678 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:20 crc kubenswrapper[4839]: I0321 04:40:20.462366 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.025780 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6"] Mar 21 04:40:21 crc kubenswrapper[4839]: E0321 04:40:21.025995 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63f3493-d532-4d99-94c0-ab8648252dab" containerName="extract" Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.026005 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63f3493-d532-4d99-94c0-ab8648252dab" containerName="extract" Mar 21 04:40:21 crc kubenswrapper[4839]: E0321 04:40:21.026015 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63f3493-d532-4d99-94c0-ab8648252dab" containerName="pull" Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.026021 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63f3493-d532-4d99-94c0-ab8648252dab" containerName="pull" Mar 21 04:40:21 crc kubenswrapper[4839]: E0321 04:40:21.026039 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63f3493-d532-4d99-94c0-ab8648252dab" containerName="util" Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.026046 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63f3493-d532-4d99-94c0-ab8648252dab" 
containerName="util" Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.026145 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f63f3493-d532-4d99-94c0-ab8648252dab" containerName="extract" Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.026526 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6" Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.029374 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-gwtj7" Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.058530 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6"] Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.117667 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.138281 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxn94\" (UniqueName: \"kubernetes.io/projected/b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59-kube-api-access-lxn94\") pod \"openstack-operator-controller-init-948579bb7-j6fx6\" (UID: \"b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59\") " pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6" Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.239791 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxn94\" (UniqueName: \"kubernetes.io/projected/b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59-kube-api-access-lxn94\") pod \"openstack-operator-controller-init-948579bb7-j6fx6\" (UID: \"b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59\") " pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6" Mar 21 04:40:21 crc kubenswrapper[4839]: 
I0321 04:40:21.264981 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxn94\" (UniqueName: \"kubernetes.io/projected/b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59-kube-api-access-lxn94\") pod \"openstack-operator-controller-init-948579bb7-j6fx6\" (UID: \"b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59\") " pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6" Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.345124 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6" Mar 21 04:40:21 crc kubenswrapper[4839]: I0321 04:40:21.646062 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6"] Mar 21 04:40:22 crc kubenswrapper[4839]: I0321 04:40:22.088373 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6" event={"ID":"b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59","Type":"ContainerStarted","Data":"7989561f519e101d54d9d516a04aec0cbc57fa8e0c963b23185fdd9a51dbf92b"} Mar 21 04:40:22 crc kubenswrapper[4839]: I0321 04:40:22.278445 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6q2g2"] Mar 21 04:40:24 crc kubenswrapper[4839]: I0321 04:40:24.114681 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6q2g2" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="registry-server" containerID="cri-o://00e0222729d7c8ca5cf26a69c3049542634e97119934b9dc854c6eadb23842db" gracePeriod=2 Mar 21 04:40:25 crc kubenswrapper[4839]: I0321 04:40:25.122168 4839 generic.go:334] "Generic (PLEG): container finished" podID="befc88a7-caca-450d-b23e-c4382b36217e" containerID="00e0222729d7c8ca5cf26a69c3049542634e97119934b9dc854c6eadb23842db" exitCode=0 Mar 21 04:40:25 crc 
kubenswrapper[4839]: I0321 04:40:25.122221 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q2g2" event={"ID":"befc88a7-caca-450d-b23e-c4382b36217e","Type":"ContainerDied","Data":"00e0222729d7c8ca5cf26a69c3049542634e97119934b9dc854c6eadb23842db"} Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.219550 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.380548 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-utilities\") pod \"befc88a7-caca-450d-b23e-c4382b36217e\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.380888 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz8c9\" (UniqueName: \"kubernetes.io/projected/befc88a7-caca-450d-b23e-c4382b36217e-kube-api-access-fz8c9\") pod \"befc88a7-caca-450d-b23e-c4382b36217e\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.381036 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-catalog-content\") pod \"befc88a7-caca-450d-b23e-c4382b36217e\" (UID: \"befc88a7-caca-450d-b23e-c4382b36217e\") " Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.381459 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-utilities" (OuterVolumeSpecName: "utilities") pod "befc88a7-caca-450d-b23e-c4382b36217e" (UID: "befc88a7-caca-450d-b23e-c4382b36217e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.381848 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.386594 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/befc88a7-caca-450d-b23e-c4382b36217e-kube-api-access-fz8c9" (OuterVolumeSpecName: "kube-api-access-fz8c9") pod "befc88a7-caca-450d-b23e-c4382b36217e" (UID: "befc88a7-caca-450d-b23e-c4382b36217e"). InnerVolumeSpecName "kube-api-access-fz8c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.429375 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "befc88a7-caca-450d-b23e-c4382b36217e" (UID: "befc88a7-caca-450d-b23e-c4382b36217e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.483330 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz8c9\" (UniqueName: \"kubernetes.io/projected/befc88a7-caca-450d-b23e-c4382b36217e-kube-api-access-fz8c9\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:27 crc kubenswrapper[4839]: I0321 04:40:27.483364 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/befc88a7-caca-450d-b23e-c4382b36217e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.148873 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6" event={"ID":"b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59","Type":"ContainerStarted","Data":"9f158af74a72fb5ea00fcf97b3330f16b9c394fcf9fc9baa4365c40374df5d7d"} Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.149256 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6" Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.152519 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6q2g2" event={"ID":"befc88a7-caca-450d-b23e-c4382b36217e","Type":"ContainerDied","Data":"ce9d7fdc03ee552772de04055fb99bd937e17bd28a54562444504527ae42320e"} Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.152604 4839 scope.go:117] "RemoveContainer" containerID="00e0222729d7c8ca5cf26a69c3049542634e97119934b9dc854c6eadb23842db" Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.152711 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6q2g2" Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.175441 4839 scope.go:117] "RemoveContainer" containerID="7dbabf80cccb9957a73998b24ba4a430e810a5d74896e5943d1676694432404b" Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.200127 4839 scope.go:117] "RemoveContainer" containerID="4f8c28d0fe8376d6fb6dbfcfb14d27aa066b107c300eb4d596d4ace562332e7b" Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.203500 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6" podStartSLOduration=1.549891813 podStartE2EDuration="7.203479361s" podCreationTimestamp="2026-03-21 04:40:21 +0000 UTC" firstStartedPulling="2026-03-21 04:40:21.659428482 +0000 UTC m=+1025.987215168" lastFinishedPulling="2026-03-21 04:40:27.31301604 +0000 UTC m=+1031.640802716" observedRunningTime="2026-03-21 04:40:28.200612581 +0000 UTC m=+1032.528399257" watchObservedRunningTime="2026-03-21 04:40:28.203479361 +0000 UTC m=+1032.531266037" Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.216872 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6q2g2"] Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.226748 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6q2g2"] Mar 21 04:40:28 crc kubenswrapper[4839]: I0321 04:40:28.461471 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="befc88a7-caca-450d-b23e-c4382b36217e" path="/var/lib/kubelet/pods/befc88a7-caca-450d-b23e-c4382b36217e/volumes" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.196427 4839 scope.go:117] "RemoveContainer" containerID="28332a1cde28bd0485ad3577e9e484f03df8597359b74118b7173ff71df9e89d" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.479015 4839 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-5dcmf"] Mar 21 04:40:30 crc kubenswrapper[4839]: E0321 04:40:30.479611 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="registry-server" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.479627 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="registry-server" Mar 21 04:40:30 crc kubenswrapper[4839]: E0321 04:40:30.479640 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="extract-content" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.479646 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="extract-content" Mar 21 04:40:30 crc kubenswrapper[4839]: E0321 04:40:30.479658 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="extract-utilities" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.479664 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="extract-utilities" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.479768 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="befc88a7-caca-450d-b23e-c4382b36217e" containerName="registry-server" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.480614 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.489355 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5dcmf"] Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.623603 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pblj8\" (UniqueName: \"kubernetes.io/projected/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-kube-api-access-pblj8\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.623650 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-utilities\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.623733 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-catalog-content\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.725160 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-catalog-content\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.725233 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pblj8\" (UniqueName: \"kubernetes.io/projected/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-kube-api-access-pblj8\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.725259 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-utilities\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.725705 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-catalog-content\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.725724 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-utilities\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.746501 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pblj8\" (UniqueName: \"kubernetes.io/projected/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-kube-api-access-pblj8\") pod \"certified-operators-5dcmf\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:30 crc kubenswrapper[4839]: I0321 04:40:30.798270 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:31 crc kubenswrapper[4839]: I0321 04:40:31.040187 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5dcmf"] Mar 21 04:40:31 crc kubenswrapper[4839]: I0321 04:40:31.172466 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dcmf" event={"ID":"f4edab0e-9c96-42e0-a1e4-20a69c5493d9","Type":"ContainerStarted","Data":"c5f60eb50affb07ac40233821bff40c2440a94b2dbdf1b2ea286ab2eed44dd44"} Mar 21 04:40:32 crc kubenswrapper[4839]: I0321 04:40:32.182359 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerID="76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e" exitCode=0 Mar 21 04:40:32 crc kubenswrapper[4839]: I0321 04:40:32.182726 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dcmf" event={"ID":"f4edab0e-9c96-42e0-a1e4-20a69c5493d9","Type":"ContainerDied","Data":"76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e"} Mar 21 04:40:33 crc kubenswrapper[4839]: I0321 04:40:33.190861 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerID="09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e" exitCode=0 Mar 21 04:40:33 crc kubenswrapper[4839]: I0321 04:40:33.190902 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dcmf" event={"ID":"f4edab0e-9c96-42e0-a1e4-20a69c5493d9","Type":"ContainerDied","Data":"09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e"} Mar 21 04:40:34 crc kubenswrapper[4839]: I0321 04:40:34.198061 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dcmf" 
event={"ID":"f4edab0e-9c96-42e0-a1e4-20a69c5493d9","Type":"ContainerStarted","Data":"2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc"} Mar 21 04:40:34 crc kubenswrapper[4839]: I0321 04:40:34.217254 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5dcmf" podStartSLOduration=2.800735849 podStartE2EDuration="4.217237946s" podCreationTimestamp="2026-03-21 04:40:30 +0000 UTC" firstStartedPulling="2026-03-21 04:40:32.184358236 +0000 UTC m=+1036.512144932" lastFinishedPulling="2026-03-21 04:40:33.600860353 +0000 UTC m=+1037.928647029" observedRunningTime="2026-03-21 04:40:34.21237563 +0000 UTC m=+1038.540162306" watchObservedRunningTime="2026-03-21 04:40:34.217237946 +0000 UTC m=+1038.545024622" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.676437 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qjmfj"] Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.678185 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.686919 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjmfj"] Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.806132 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98rlh\" (UniqueName: \"kubernetes.io/projected/edae614d-050e-4b5b-afed-f694797a2d8b-kube-api-access-98rlh\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.806186 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-utilities\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.806235 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-catalog-content\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.907634 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98rlh\" (UniqueName: \"kubernetes.io/projected/edae614d-050e-4b5b-afed-f694797a2d8b-kube-api-access-98rlh\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.907797 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-utilities\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.907872 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-catalog-content\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.908333 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-utilities\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.908393 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-catalog-content\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.926300 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98rlh\" (UniqueName: \"kubernetes.io/projected/edae614d-050e-4b5b-afed-f694797a2d8b-kube-api-access-98rlh\") pod \"redhat-marketplace-qjmfj\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:36 crc kubenswrapper[4839]: I0321 04:40:36.996201 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:37 crc kubenswrapper[4839]: I0321 04:40:37.207500 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjmfj"] Mar 21 04:40:37 crc kubenswrapper[4839]: W0321 04:40:37.213809 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedae614d_050e_4b5b_afed_f694797a2d8b.slice/crio-a16a7b7bb667ce36d9a1deb487ba3a222cb842d5c93e4157daa4a4cc14dd3b84 WatchSource:0}: Error finding container a16a7b7bb667ce36d9a1deb487ba3a222cb842d5c93e4157daa4a4cc14dd3b84: Status 404 returned error can't find the container with id a16a7b7bb667ce36d9a1deb487ba3a222cb842d5c93e4157daa4a4cc14dd3b84 Mar 21 04:40:38 crc kubenswrapper[4839]: I0321 04:40:38.223588 4839 generic.go:334] "Generic (PLEG): container finished" podID="edae614d-050e-4b5b-afed-f694797a2d8b" containerID="22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c" exitCode=0 Mar 21 04:40:38 crc kubenswrapper[4839]: I0321 04:40:38.223652 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjmfj" event={"ID":"edae614d-050e-4b5b-afed-f694797a2d8b","Type":"ContainerDied","Data":"22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c"} Mar 21 04:40:38 crc kubenswrapper[4839]: I0321 04:40:38.223910 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjmfj" event={"ID":"edae614d-050e-4b5b-afed-f694797a2d8b","Type":"ContainerStarted","Data":"a16a7b7bb667ce36d9a1deb487ba3a222cb842d5c93e4157daa4a4cc14dd3b84"} Mar 21 04:40:40 crc kubenswrapper[4839]: I0321 04:40:40.240037 4839 generic.go:334] "Generic (PLEG): container finished" podID="edae614d-050e-4b5b-afed-f694797a2d8b" containerID="63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6" exitCode=0 Mar 21 04:40:40 crc kubenswrapper[4839]: I0321 
04:40:40.240192 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjmfj" event={"ID":"edae614d-050e-4b5b-afed-f694797a2d8b","Type":"ContainerDied","Data":"63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6"} Mar 21 04:40:40 crc kubenswrapper[4839]: I0321 04:40:40.799008 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:40 crc kubenswrapper[4839]: I0321 04:40:40.799366 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:40 crc kubenswrapper[4839]: I0321 04:40:40.839265 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:41 crc kubenswrapper[4839]: I0321 04:40:41.248946 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjmfj" event={"ID":"edae614d-050e-4b5b-afed-f694797a2d8b","Type":"ContainerStarted","Data":"8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463"} Mar 21 04:40:41 crc kubenswrapper[4839]: I0321 04:40:41.275232 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qjmfj" podStartSLOduration=2.574380659 podStartE2EDuration="5.275201535s" podCreationTimestamp="2026-03-21 04:40:36 +0000 UTC" firstStartedPulling="2026-03-21 04:40:38.225073817 +0000 UTC m=+1042.552860493" lastFinishedPulling="2026-03-21 04:40:40.925894693 +0000 UTC m=+1045.253681369" observedRunningTime="2026-03-21 04:40:41.267675845 +0000 UTC m=+1045.595462521" watchObservedRunningTime="2026-03-21 04:40:41.275201535 +0000 UTC m=+1045.602988251" Mar 21 04:40:41 crc kubenswrapper[4839]: I0321 04:40:41.289820 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 
04:40:41 crc kubenswrapper[4839]: I0321 04:40:41.347964 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-948579bb7-j6fx6" Mar 21 04:40:42 crc kubenswrapper[4839]: I0321 04:40:42.671888 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5dcmf"] Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.266002 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5dcmf" podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="registry-server" containerID="cri-o://2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc" gracePeriod=2 Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.624418 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.807493 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pblj8\" (UniqueName: \"kubernetes.io/projected/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-kube-api-access-pblj8\") pod \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.807674 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-utilities\") pod \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\" (UID: \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.807729 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-catalog-content\") pod \"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\" (UID: 
\"f4edab0e-9c96-42e0-a1e4-20a69c5493d9\") " Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.808500 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-utilities" (OuterVolumeSpecName: "utilities") pod "f4edab0e-9c96-42e0-a1e4-20a69c5493d9" (UID: "f4edab0e-9c96-42e0-a1e4-20a69c5493d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.820904 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-kube-api-access-pblj8" (OuterVolumeSpecName: "kube-api-access-pblj8") pod "f4edab0e-9c96-42e0-a1e4-20a69c5493d9" (UID: "f4edab0e-9c96-42e0-a1e4-20a69c5493d9"). InnerVolumeSpecName "kube-api-access-pblj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.861257 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4edab0e-9c96-42e0-a1e4-20a69c5493d9" (UID: "f4edab0e-9c96-42e0-a1e4-20a69c5493d9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.909660 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.909689 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pblj8\" (UniqueName: \"kubernetes.io/projected/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-kube-api-access-pblj8\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:43 crc kubenswrapper[4839]: I0321 04:40:43.909700 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4edab0e-9c96-42e0-a1e4-20a69c5493d9-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.274490 4839 generic.go:334] "Generic (PLEG): container finished" podID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerID="2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc" exitCode=0 Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.274611 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5dcmf" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.274606 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dcmf" event={"ID":"f4edab0e-9c96-42e0-a1e4-20a69c5493d9","Type":"ContainerDied","Data":"2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc"} Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.274762 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5dcmf" event={"ID":"f4edab0e-9c96-42e0-a1e4-20a69c5493d9","Type":"ContainerDied","Data":"c5f60eb50affb07ac40233821bff40c2440a94b2dbdf1b2ea286ab2eed44dd44"} Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.274787 4839 scope.go:117] "RemoveContainer" containerID="2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.304472 4839 scope.go:117] "RemoveContainer" containerID="09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.314150 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5dcmf"] Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.324991 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5dcmf"] Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.326902 4839 scope.go:117] "RemoveContainer" containerID="76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.358867 4839 scope.go:117] "RemoveContainer" containerID="2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc" Mar 21 04:40:44 crc kubenswrapper[4839]: E0321 04:40:44.362005 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc\": container with ID starting with 2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc not found: ID does not exist" containerID="2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.362043 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc"} err="failed to get container status \"2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc\": rpc error: code = NotFound desc = could not find container \"2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc\": container with ID starting with 2d39b08aec369487f04720788b081c6a69ec46132e51baebda55da37620565dc not found: ID does not exist" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.362069 4839 scope.go:117] "RemoveContainer" containerID="09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e" Mar 21 04:40:44 crc kubenswrapper[4839]: E0321 04:40:44.362486 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e\": container with ID starting with 09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e not found: ID does not exist" containerID="09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.362515 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e"} err="failed to get container status \"09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e\": rpc error: code = NotFound desc = could not find container \"09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e\": container with ID 
starting with 09ab1301f9e2ee1ffc883bb7c61bb7c49bf4d4738a6c88515d79f190ab72642e not found: ID does not exist" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.362531 4839 scope.go:117] "RemoveContainer" containerID="76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e" Mar 21 04:40:44 crc kubenswrapper[4839]: E0321 04:40:44.362863 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e\": container with ID starting with 76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e not found: ID does not exist" containerID="76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.362884 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e"} err="failed to get container status \"76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e\": rpc error: code = NotFound desc = could not find container \"76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e\": container with ID starting with 76cb461c5bc389f3fddf6611c6e82c1153d3ae4660df3c024829cb1932b42d3e not found: ID does not exist" Mar 21 04:40:44 crc kubenswrapper[4839]: I0321 04:40:44.464389 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" path="/var/lib/kubelet/pods/f4edab0e-9c96-42e0-a1e4-20a69c5493d9/volumes" Mar 21 04:40:46 crc kubenswrapper[4839]: I0321 04:40:46.997117 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:46 crc kubenswrapper[4839]: I0321 04:40:46.998144 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:47 crc 
kubenswrapper[4839]: I0321 04:40:47.087802 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:47 crc kubenswrapper[4839]: I0321 04:40:47.331935 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:48 crc kubenswrapper[4839]: I0321 04:40:48.272310 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjmfj"] Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.300789 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qjmfj" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="registry-server" containerID="cri-o://8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463" gracePeriod=2 Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.651269 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.777404 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-catalog-content\") pod \"edae614d-050e-4b5b-afed-f694797a2d8b\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.777502 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-utilities\") pod \"edae614d-050e-4b5b-afed-f694797a2d8b\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.777551 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98rlh\" (UniqueName: \"kubernetes.io/projected/edae614d-050e-4b5b-afed-f694797a2d8b-kube-api-access-98rlh\") pod \"edae614d-050e-4b5b-afed-f694797a2d8b\" (UID: \"edae614d-050e-4b5b-afed-f694797a2d8b\") " Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.778315 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-utilities" (OuterVolumeSpecName: "utilities") pod "edae614d-050e-4b5b-afed-f694797a2d8b" (UID: "edae614d-050e-4b5b-afed-f694797a2d8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.790404 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edae614d-050e-4b5b-afed-f694797a2d8b-kube-api-access-98rlh" (OuterVolumeSpecName: "kube-api-access-98rlh") pod "edae614d-050e-4b5b-afed-f694797a2d8b" (UID: "edae614d-050e-4b5b-afed-f694797a2d8b"). InnerVolumeSpecName "kube-api-access-98rlh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.809988 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edae614d-050e-4b5b-afed-f694797a2d8b" (UID: "edae614d-050e-4b5b-afed-f694797a2d8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.879370 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.879399 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edae614d-050e-4b5b-afed-f694797a2d8b-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:49 crc kubenswrapper[4839]: I0321 04:40:49.879408 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98rlh\" (UniqueName: \"kubernetes.io/projected/edae614d-050e-4b5b-afed-f694797a2d8b-kube-api-access-98rlh\") on node \"crc\" DevicePath \"\"" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.307965 4839 generic.go:334] "Generic (PLEG): container finished" podID="edae614d-050e-4b5b-afed-f694797a2d8b" containerID="8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463" exitCode=0 Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.308020 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjmfj" event={"ID":"edae614d-050e-4b5b-afed-f694797a2d8b","Type":"ContainerDied","Data":"8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463"} Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.308053 4839 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjmfj" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.308083 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjmfj" event={"ID":"edae614d-050e-4b5b-afed-f694797a2d8b","Type":"ContainerDied","Data":"a16a7b7bb667ce36d9a1deb487ba3a222cb842d5c93e4157daa4a4cc14dd3b84"} Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.308111 4839 scope.go:117] "RemoveContainer" containerID="8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.330027 4839 scope.go:117] "RemoveContainer" containerID="63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.339021 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjmfj"] Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.343392 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjmfj"] Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.349628 4839 scope.go:117] "RemoveContainer" containerID="22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.368304 4839 scope.go:117] "RemoveContainer" containerID="8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463" Mar 21 04:40:50 crc kubenswrapper[4839]: E0321 04:40:50.369225 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463\": container with ID starting with 8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463 not found: ID does not exist" containerID="8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.369301 4839 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463"} err="failed to get container status \"8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463\": rpc error: code = NotFound desc = could not find container \"8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463\": container with ID starting with 8d96a607137088fcd99a7eb42a522d1c84d223d8932ab44eb6e0d43d47637463 not found: ID does not exist" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.369347 4839 scope.go:117] "RemoveContainer" containerID="63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6" Mar 21 04:40:50 crc kubenswrapper[4839]: E0321 04:40:50.369892 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6\": container with ID starting with 63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6 not found: ID does not exist" containerID="63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.369945 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6"} err="failed to get container status \"63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6\": rpc error: code = NotFound desc = could not find container \"63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6\": container with ID starting with 63f0a061f3e36b2f811467c7f994b71affd66ca79c0317c1ddad231a315960b6 not found: ID does not exist" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.369963 4839 scope.go:117] "RemoveContainer" containerID="22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c" Mar 21 04:40:50 crc kubenswrapper[4839]: E0321 
04:40:50.370326 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c\": container with ID starting with 22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c not found: ID does not exist" containerID="22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.370358 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c"} err="failed to get container status \"22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c\": rpc error: code = NotFound desc = could not find container \"22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c\": container with ID starting with 22bdc0fcd51be93f0ee040bb89784038d3a1acbeb9f3d78dd39bc6607be12b7c not found: ID does not exist" Mar 21 04:40:50 crc kubenswrapper[4839]: I0321 04:40:50.460983 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" path="/var/lib/kubelet/pods/edae614d-050e-4b5b-afed-f694797a2d8b/volumes" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.454557 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz"] Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.455221 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="extract-content" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455234 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="extract-content" Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.455246 4839 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="extract-utilities" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455252 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="extract-utilities" Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.455258 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="registry-server" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455265 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="registry-server" Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.455276 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="registry-server" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455282 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="registry-server" Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.455293 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="extract-utilities" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455299 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="extract-utilities" Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.455311 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="extract-content" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455317 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="extract-content" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455425 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4edab0e-9c96-42e0-a1e4-20a69c5493d9" containerName="registry-server" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455443 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="edae614d-050e-4b5b-afed-f694797a2d8b" containerName="registry-server" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.455844 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.458050 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ww6sd" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.459072 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.459973 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.461494 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-dw2zz" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.472346 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.480209 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.481165 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.482581 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-k62gw" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.487689 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.508720 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.523227 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.524355 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.524741 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fztr\" (UniqueName: \"kubernetes.io/projected/05f30a88-e899-4727-9440-981d010a1342-kube-api-access-5fztr\") pod \"cinder-operator-controller-manager-8d58dc466-dncxc\" (UID: \"05f30a88-e899-4727-9440-981d010a1342\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.524891 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njspv\" (UniqueName: \"kubernetes.io/projected/0c51ffa0-2285-4f7e-af09-0cafba139934-kube-api-access-njspv\") pod \"barbican-operator-controller-manager-59bc569d95-2mkmz\" (UID: \"0c51ffa0-2285-4f7e-af09-0cafba139934\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.525100 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvkpt\" (UniqueName: \"kubernetes.io/projected/ee9d64a7-0d03-4cb0-a266-47b26f9957b5-kube-api-access-lvkpt\") pod \"designate-operator-controller-manager-588d4d986b-9s4vt\" (UID: \"ee9d64a7-0d03-4cb0-a266-47b26f9957b5\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.529475 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.533478 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-mtcqf" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 
04:40:59.535762 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.536919 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.543939 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-r6l7p" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.559639 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.581682 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.582534 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.598889 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-t2d4l" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.628861 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfn9l\" (UniqueName: \"kubernetes.io/projected/fd731e7e-440b-4e77-a778-08a4a62e0c9f-kube-api-access-wfn9l\") pod \"heat-operator-controller-manager-67dd5f86f5-2n27d\" (UID: \"fd731e7e-440b-4e77-a778-08a4a62e0c9f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.628925 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh8j8\" (UniqueName: \"kubernetes.io/projected/acb1d7ac-b3f9-4564-8346-344ffb5c3964-kube-api-access-kh8j8\") pod \"horizon-operator-controller-manager-8464cc45fb-d7h7r\" (UID: \"acb1d7ac-b3f9-4564-8346-344ffb5c3964\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.628954 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njspv\" (UniqueName: \"kubernetes.io/projected/0c51ffa0-2285-4f7e-af09-0cafba139934-kube-api-access-njspv\") pod \"barbican-operator-controller-manager-59bc569d95-2mkmz\" (UID: \"0c51ffa0-2285-4f7e-af09-0cafba139934\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.629015 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvkpt\" (UniqueName: \"kubernetes.io/projected/ee9d64a7-0d03-4cb0-a266-47b26f9957b5-kube-api-access-lvkpt\") 
pod \"designate-operator-controller-manager-588d4d986b-9s4vt\" (UID: \"ee9d64a7-0d03-4cb0-a266-47b26f9957b5\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.629047 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blk4c\" (UniqueName: \"kubernetes.io/projected/d3dc722f-f66c-46a0-9b1a-ae1b9c4de060-kube-api-access-blk4c\") pod \"glance-operator-controller-manager-79df6bcc97-6s6q7\" (UID: \"d3dc722f-f66c-46a0-9b1a-ae1b9c4de060\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.629082 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fztr\" (UniqueName: \"kubernetes.io/projected/05f30a88-e899-4727-9440-981d010a1342-kube-api-access-5fztr\") pod \"cinder-operator-controller-manager-8d58dc466-dncxc\" (UID: \"05f30a88-e899-4727-9440-981d010a1342\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.668882 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.669041 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvkpt\" (UniqueName: \"kubernetes.io/projected/ee9d64a7-0d03-4cb0-a266-47b26f9957b5-kube-api-access-lvkpt\") pod \"designate-operator-controller-manager-588d4d986b-9s4vt\" (UID: \"ee9d64a7-0d03-4cb0-a266-47b26f9957b5\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.670432 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.674164 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fztr\" (UniqueName: \"kubernetes.io/projected/05f30a88-e899-4727-9440-981d010a1342-kube-api-access-5fztr\") pod \"cinder-operator-controller-manager-8d58dc466-dncxc\" (UID: \"05f30a88-e899-4727-9440-981d010a1342\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.674729 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-6p58z" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.683738 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.684964 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.693748 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-m29j8" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.693933 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.694538 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njspv\" (UniqueName: \"kubernetes.io/projected/0c51ffa0-2285-4f7e-af09-0cafba139934-kube-api-access-njspv\") pod \"barbican-operator-controller-manager-59bc569d95-2mkmz\" (UID: \"0c51ffa0-2285-4f7e-af09-0cafba139934\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.704554 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.707589 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.712830 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.713879 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.718920 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.725979 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-m5jtl" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.726695 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.727588 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.732372 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blk4c\" (UniqueName: \"kubernetes.io/projected/d3dc722f-f66c-46a0-9b1a-ae1b9c4de060-kube-api-access-blk4c\") pod \"glance-operator-controller-manager-79df6bcc97-6s6q7\" (UID: \"d3dc722f-f66c-46a0-9b1a-ae1b9c4de060\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.732433 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.732481 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sdzgg\" (UniqueName: \"kubernetes.io/projected/7a7bf7a3-acea-4059-8a89-db576f3588d1-kube-api-access-sdzgg\") pod \"keystone-operator-controller-manager-768b96df4c-k4lg5\" (UID: \"7a7bf7a3-acea-4059-8a89-db576f3588d1\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.732512 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvp89\" (UniqueName: \"kubernetes.io/projected/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-kube-api-access-lvp89\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.732548 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dch45\" (UniqueName: \"kubernetes.io/projected/ccec0d11-294b-43a2-be2e-fcef8a6818c6-kube-api-access-dch45\") pod \"ironic-operator-controller-manager-6f787dddc9-8sg4d\" (UID: \"ccec0d11-294b-43a2-be2e-fcef8a6818c6\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.732620 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfn9l\" (UniqueName: \"kubernetes.io/projected/fd731e7e-440b-4e77-a778-08a4a62e0c9f-kube-api-access-wfn9l\") pod \"heat-operator-controller-manager-67dd5f86f5-2n27d\" (UID: \"fd731e7e-440b-4e77-a778-08a4a62e0c9f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.732655 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh8j8\" (UniqueName: \"kubernetes.io/projected/acb1d7ac-b3f9-4564-8346-344ffb5c3964-kube-api-access-kh8j8\") 
pod \"horizon-operator-controller-manager-8464cc45fb-d7h7r\" (UID: \"acb1d7ac-b3f9-4564-8346-344ffb5c3964\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.733527 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-kw7js" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.746638 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.773728 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.774621 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blk4c\" (UniqueName: \"kubernetes.io/projected/d3dc722f-f66c-46a0-9b1a-ae1b9c4de060-kube-api-access-blk4c\") pod \"glance-operator-controller-manager-79df6bcc97-6s6q7\" (UID: \"d3dc722f-f66c-46a0-9b1a-ae1b9c4de060\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.774933 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.780226 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfn9l\" (UniqueName: \"kubernetes.io/projected/fd731e7e-440b-4e77-a778-08a4a62e0c9f-kube-api-access-wfn9l\") pod \"heat-operator-controller-manager-67dd5f86f5-2n27d\" (UID: \"fd731e7e-440b-4e77-a778-08a4a62e0c9f\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.784929 4839 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.792787 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.793703 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh8j8\" (UniqueName: \"kubernetes.io/projected/acb1d7ac-b3f9-4564-8346-344ffb5c3964-kube-api-access-kh8j8\") pod \"horizon-operator-controller-manager-8464cc45fb-d7h7r\" (UID: \"acb1d7ac-b3f9-4564-8346-344ffb5c3964\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.794193 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-94vpf"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.794911 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.795552 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.800021 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.810137 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-w7mps" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.810543 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-65hn8" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.815876 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.820268 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.821488 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.826656 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-c7dml" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.838971 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvp89\" (UniqueName: \"kubernetes.io/projected/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-kube-api-access-lvp89\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.839041 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rjxn\" (UniqueName: \"kubernetes.io/projected/6074766c-0ecd-4051-a676-dcc21b24184f-kube-api-access-6rjxn\") pod \"manila-operator-controller-manager-55f864c847-gzh8j\" (UID: \"6074766c-0ecd-4051-a676-dcc21b24184f\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.839084 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dch45\" (UniqueName: \"kubernetes.io/projected/ccec0d11-294b-43a2-be2e-fcef8a6818c6-kube-api-access-dch45\") pod \"ironic-operator-controller-manager-6f787dddc9-8sg4d\" (UID: \"ccec0d11-294b-43a2-be2e-fcef8a6818c6\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.839119 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcpnh\" (UniqueName: \"kubernetes.io/projected/70702cd5-6815-4a01-98a4-2f4dfaeef839-kube-api-access-tcpnh\") pod 
\"neutron-operator-controller-manager-767865f676-94vpf\" (UID: \"70702cd5-6815-4a01-98a4-2f4dfaeef839\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.839164 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfbjv\" (UniqueName: \"kubernetes.io/projected/2162bafb-7e49-435c-9591-d8b725f10336-kube-api-access-xfbjv\") pod \"mariadb-operator-controller-manager-67ccfc9778-sp4j4\" (UID: \"2162bafb-7e49-435c-9591-d8b725f10336\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.839217 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6648r\" (UniqueName: \"kubernetes.io/projected/6914418f-3639-4ebc-a58d-d8b478cbf6b4-kube-api-access-6648r\") pod \"nova-operator-controller-manager-5d488d59fb-wjw9j\" (UID: \"6914418f-3639-4ebc-a58d-d8b478cbf6b4\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.839242 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.839279 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdzgg\" (UniqueName: \"kubernetes.io/projected/7a7bf7a3-acea-4059-8a89-db576f3588d1-kube-api-access-sdzgg\") pod \"keystone-operator-controller-manager-768b96df4c-k4lg5\" (UID: \"7a7bf7a3-acea-4059-8a89-db576f3588d1\") " 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.840153 4839 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 04:40:59 crc kubenswrapper[4839]: E0321 04:40:59.840218 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert podName:ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b nodeName:}" failed. No retries permitted until 2026-03-21 04:41:00.340195655 +0000 UTC m=+1064.667982331 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert") pod "infra-operator-controller-manager-7b9c774f96-bsdjs" (UID: "ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b") : secret "infra-operator-webhook-server-cert" not found Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.847477 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.860654 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.862524 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.863266 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.870159 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.884352 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdzgg\" (UniqueName: \"kubernetes.io/projected/7a7bf7a3-acea-4059-8a89-db576f3588d1-kube-api-access-sdzgg\") pod \"keystone-operator-controller-manager-768b96df4c-k4lg5\" (UID: \"7a7bf7a3-acea-4059-8a89-db576f3588d1\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.888509 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-w2xjn" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.902454 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dch45\" (UniqueName: \"kubernetes.io/projected/ccec0d11-294b-43a2-be2e-fcef8a6818c6-kube-api-access-dch45\") pod \"ironic-operator-controller-manager-6f787dddc9-8sg4d\" (UID: \"ccec0d11-294b-43a2-be2e-fcef8a6818c6\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.915764 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvp89\" (UniqueName: \"kubernetes.io/projected/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-kube-api-access-lvp89\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.938323 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.941424 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6648r\" (UniqueName: \"kubernetes.io/projected/6914418f-3639-4ebc-a58d-d8b478cbf6b4-kube-api-access-6648r\") pod \"nova-operator-controller-manager-5d488d59fb-wjw9j\" (UID: \"6914418f-3639-4ebc-a58d-d8b478cbf6b4\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.941528 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rjxn\" (UniqueName: \"kubernetes.io/projected/6074766c-0ecd-4051-a676-dcc21b24184f-kube-api-access-6rjxn\") pod \"manila-operator-controller-manager-55f864c847-gzh8j\" (UID: \"6074766c-0ecd-4051-a676-dcc21b24184f\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.957728 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcpnh\" (UniqueName: \"kubernetes.io/projected/70702cd5-6815-4a01-98a4-2f4dfaeef839-kube-api-access-tcpnh\") pod \"neutron-operator-controller-manager-767865f676-94vpf\" (UID: \"70702cd5-6815-4a01-98a4-2f4dfaeef839\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.957888 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfbjv\" (UniqueName: \"kubernetes.io/projected/2162bafb-7e49-435c-9591-d8b725f10336-kube-api-access-xfbjv\") pod \"mariadb-operator-controller-manager-67ccfc9778-sp4j4\" (UID: \"2162bafb-7e49-435c-9591-d8b725f10336\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 
04:40:59.957929 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcn8p\" (UniqueName: \"kubernetes.io/projected/faac458b-73d9-4fb8-9f1c-50f7521088b0-kube-api-access-kcn8p\") pod \"octavia-operator-controller-manager-5b9f45d989-6p4mn\" (UID: \"faac458b-73d9-4fb8-9f1c-50f7521088b0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.967627 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-94vpf"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.985185 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn"] Mar 21 04:40:59 crc kubenswrapper[4839]: I0321 04:40:59.999552 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfbjv\" (UniqueName: \"kubernetes.io/projected/2162bafb-7e49-435c-9591-d8b725f10336-kube-api-access-xfbjv\") pod \"mariadb-operator-controller-manager-67ccfc9778-sp4j4\" (UID: \"2162bafb-7e49-435c-9591-d8b725f10336\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.001596 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcpnh\" (UniqueName: \"kubernetes.io/projected/70702cd5-6815-4a01-98a4-2f4dfaeef839-kube-api-access-tcpnh\") pod \"neutron-operator-controller-manager-767865f676-94vpf\" (UID: \"70702cd5-6815-4a01-98a4-2f4dfaeef839\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.002653 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6648r\" (UniqueName: \"kubernetes.io/projected/6914418f-3639-4ebc-a58d-d8b478cbf6b4-kube-api-access-6648r\") 
pod \"nova-operator-controller-manager-5d488d59fb-wjw9j\" (UID: \"6914418f-3639-4ebc-a58d-d8b478cbf6b4\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.003864 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.012246 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rjxn\" (UniqueName: \"kubernetes.io/projected/6074766c-0ecd-4051-a676-dcc21b24184f-kube-api-access-6rjxn\") pod \"manila-operator-controller-manager-55f864c847-gzh8j\" (UID: \"6074766c-0ecd-4051-a676-dcc21b24184f\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.024603 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.026408 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.031634 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.031677 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-m6g24" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.041672 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-qt58c"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.042768 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.044754 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.058098 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-js6r2" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.066489 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtzcv\" (UniqueName: \"kubernetes.io/projected/859b11bc-e9fb-40a2-a053-66a07337965c-kube-api-access-rtzcv\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.066589 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcn8p\" (UniqueName: \"kubernetes.io/projected/faac458b-73d9-4fb8-9f1c-50f7521088b0-kube-api-access-kcn8p\") pod \"octavia-operator-controller-manager-5b9f45d989-6p4mn\" (UID: \"faac458b-73d9-4fb8-9f1c-50f7521088b0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.066618 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.066641 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdqxk\" (UniqueName: 
\"kubernetes.io/projected/379b40a1-e3f5-448b-b668-0f168457e5d0-kube-api-access-bdqxk\") pod \"ovn-operator-controller-manager-884679f54-qt58c\" (UID: \"379b40a1-e3f5-448b-b668-0f168457e5d0\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.113155 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-qt58c"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.122291 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcn8p\" (UniqueName: \"kubernetes.io/projected/faac458b-73d9-4fb8-9f1c-50f7521088b0-kube-api-access-kcn8p\") pod \"octavia-operator-controller-manager-5b9f45d989-6p4mn\" (UID: \"faac458b-73d9-4fb8-9f1c-50f7521088b0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.150348 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.165840 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-x75fd"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.166689 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.168170 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.169474 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtzcv\" (UniqueName: \"kubernetes.io/projected/859b11bc-e9fb-40a2-a053-66a07337965c-kube-api-access-rtzcv\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.169592 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.169625 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdqxk\" (UniqueName: \"kubernetes.io/projected/379b40a1-e3f5-448b-b668-0f168457e5d0-kube-api-access-bdqxk\") pod \"ovn-operator-controller-manager-884679f54-qt58c\" (UID: \"379b40a1-e3f5-448b-b668-0f168457e5d0\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.176032 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-n9vx9" Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.176655 4839 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.176712 
4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert podName:859b11bc-e9fb-40a2-a053-66a07337965c nodeName:}" failed. No retries permitted until 2026-03-21 04:41:00.676695439 +0000 UTC m=+1065.004482115 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-8gc22" (UID: "859b11bc-e9fb-40a2-a053-66a07337965c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.188495 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.189475 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.197689 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.203866 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-l5ks9" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.223745 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdqxk\" (UniqueName: \"kubernetes.io/projected/379b40a1-e3f5-448b-b668-0f168457e5d0-kube-api-access-bdqxk\") pod \"ovn-operator-controller-manager-884679f54-qt58c\" (UID: \"379b40a1-e3f5-448b-b668-0f168457e5d0\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.224349 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.240775 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtzcv\" (UniqueName: \"kubernetes.io/projected/859b11bc-e9fb-40a2-a053-66a07337965c-kube-api-access-rtzcv\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.267835 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.296067 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfvwt\" (UniqueName: \"kubernetes.io/projected/361c2d7b-9a75-41fd-953d-4b1bd64ca6df-kube-api-access-rfvwt\") pod \"placement-operator-controller-manager-5784578c99-x75fd\" (UID: \"361c2d7b-9a75-41fd-953d-4b1bd64ca6df\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.296104 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4shpf\" (UniqueName: \"kubernetes.io/projected/2045f5d2-c67e-47cd-b16d-3c69d449f099-kube-api-access-4shpf\") pod \"swift-operator-controller-manager-c674c5965-xt7xt\" (UID: \"2045f5d2-c67e-47cd-b16d-3c69d449f099\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.325621 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-x75fd"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.346032 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.360079 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.367161 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.370434 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.372714 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-59xc6" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.397064 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfvwt\" (UniqueName: \"kubernetes.io/projected/361c2d7b-9a75-41fd-953d-4b1bd64ca6df-kube-api-access-rfvwt\") pod \"placement-operator-controller-manager-5784578c99-x75fd\" (UID: \"361c2d7b-9a75-41fd-953d-4b1bd64ca6df\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.397106 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4shpf\" (UniqueName: \"kubernetes.io/projected/2045f5d2-c67e-47cd-b16d-3c69d449f099-kube-api-access-4shpf\") pod \"swift-operator-controller-manager-c674c5965-xt7xt\" (UID: \"2045f5d2-c67e-47cd-b16d-3c69d449f099\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.397127 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.397294 4839 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.397342 4839 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert podName:ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b nodeName:}" failed. No retries permitted until 2026-03-21 04:41:01.397327631 +0000 UTC m=+1065.725114307 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert") pod "infra-operator-controller-manager-7b9c774f96-bsdjs" (UID: "ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b") : secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.405559 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.416447 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.416920 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4shpf\" (UniqueName: \"kubernetes.io/projected/2045f5d2-c67e-47cd-b16d-3c69d449f099-kube-api-access-4shpf\") pod \"swift-operator-controller-manager-c674c5965-xt7xt\" (UID: \"2045f5d2-c67e-47cd-b16d-3c69d449f099\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.417796 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.423879 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-cgxcz" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.425319 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.431430 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfvwt\" (UniqueName: \"kubernetes.io/projected/361c2d7b-9a75-41fd-953d-4b1bd64ca6df-kube-api-access-rfvwt\") pod \"placement-operator-controller-manager-5784578c99-x75fd\" (UID: \"361c2d7b-9a75-41fd-953d-4b1bd64ca6df\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.434924 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.437502 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.439654 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-v8qj9" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.440936 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.443758 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.499335 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzgnq\" (UniqueName: \"kubernetes.io/projected/1d32b541-7b80-492b-adac-e51d5090b668-kube-api-access-bzgnq\") pod \"watcher-operator-controller-manager-6c4d75f7f9-hh27s\" (UID: \"1d32b541-7b80-492b-adac-e51d5090b668\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.499412 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx8b8\" (UniqueName: \"kubernetes.io/projected/d3ea9c2e-11a4-492e-9e84-8294e81ce775-kube-api-access-bx8b8\") pod \"telemetry-operator-controller-manager-d6b694c5-btkvt\" (UID: \"d3ea9c2e-11a4-492e-9e84-8294e81ce775\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.499442 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hwvx\" (UniqueName: \"kubernetes.io/projected/5eeb53bd-3988-458f-baa5-d265e0178aea-kube-api-access-2hwvx\") pod \"test-operator-controller-manager-5c5cb9c4d7-7f4qh\" (UID: \"5eeb53bd-3988-458f-baa5-d265e0178aea\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.518130 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.519106 4839 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.519130 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.519859 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.519949 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.520318 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.523400 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vcnt8" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.523530 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-wrpz4" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.523669 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.523865 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.524517 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.545682 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.601294 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hwvx\" (UniqueName: \"kubernetes.io/projected/5eeb53bd-3988-458f-baa5-d265e0178aea-kube-api-access-2hwvx\") pod \"test-operator-controller-manager-5c5cb9c4d7-7f4qh\" (UID: \"5eeb53bd-3988-458f-baa5-d265e0178aea\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.601922 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.601964 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.602002 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgnfs\" (UniqueName: \"kubernetes.io/projected/c8584ecb-dc92-4cec-9178-3017f09095da-kube-api-access-mgnfs\") pod 
\"rabbitmq-cluster-operator-manager-668c99d594-lzbtt\" (UID: \"c8584ecb-dc92-4cec-9178-3017f09095da\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.602026 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qfsq\" (UniqueName: \"kubernetes.io/projected/06f9e67e-8978-46a1-9dc8-c511197241e2-kube-api-access-2qfsq\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.602050 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzgnq\" (UniqueName: \"kubernetes.io/projected/1d32b541-7b80-492b-adac-e51d5090b668-kube-api-access-bzgnq\") pod \"watcher-operator-controller-manager-6c4d75f7f9-hh27s\" (UID: \"1d32b541-7b80-492b-adac-e51d5090b668\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.602095 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx8b8\" (UniqueName: \"kubernetes.io/projected/d3ea9c2e-11a4-492e-9e84-8294e81ce775-kube-api-access-bx8b8\") pod \"telemetry-operator-controller-manager-d6b694c5-btkvt\" (UID: \"d3ea9c2e-11a4-492e-9e84-8294e81ce775\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.622304 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hwvx\" (UniqueName: \"kubernetes.io/projected/5eeb53bd-3988-458f-baa5-d265e0178aea-kube-api-access-2hwvx\") pod \"test-operator-controller-manager-5c5cb9c4d7-7f4qh\" (UID: \"5eeb53bd-3988-458f-baa5-d265e0178aea\") " 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.623722 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzgnq\" (UniqueName: \"kubernetes.io/projected/1d32b541-7b80-492b-adac-e51d5090b668-kube-api-access-bzgnq\") pod \"watcher-operator-controller-manager-6c4d75f7f9-hh27s\" (UID: \"1d32b541-7b80-492b-adac-e51d5090b668\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.627650 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.634871 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx8b8\" (UniqueName: \"kubernetes.io/projected/d3ea9c2e-11a4-492e-9e84-8294e81ce775-kube-api-access-bx8b8\") pod \"telemetry-operator-controller-manager-d6b694c5-btkvt\" (UID: \"d3ea9c2e-11a4-492e-9e84-8294e81ce775\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.704523 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.704619 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " 
pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.704672 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.704712 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgnfs\" (UniqueName: \"kubernetes.io/projected/c8584ecb-dc92-4cec-9178-3017f09095da-kube-api-access-mgnfs\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lzbtt\" (UID: \"c8584ecb-dc92-4cec-9178-3017f09095da\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.704735 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qfsq\" (UniqueName: \"kubernetes.io/projected/06f9e67e-8978-46a1-9dc8-c511197241e2-kube-api-access-2qfsq\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.705618 4839 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.705685 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. 
No retries permitted until 2026-03-21 04:41:01.205669377 +0000 UTC m=+1065.533456053 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "metrics-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.705890 4839 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.705912 4839 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.705920 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:01.205912324 +0000 UTC m=+1065.533698990 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: E0321 04:41:00.705939 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert podName:859b11bc-e9fb-40a2-a053-66a07337965c nodeName:}" failed. No retries permitted until 2026-03-21 04:41:01.705931835 +0000 UTC m=+1066.033718511 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-8gc22" (UID: "859b11bc-e9fb-40a2-a053-66a07337965c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.738381 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qfsq\" (UniqueName: \"kubernetes.io/projected/06f9e67e-8978-46a1-9dc8-c511197241e2-kube-api-access-2qfsq\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.746042 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgnfs\" (UniqueName: \"kubernetes.io/projected/c8584ecb-dc92-4cec-9178-3017f09095da-kube-api-access-mgnfs\") pod \"rabbitmq-cluster-operator-manager-668c99d594-lzbtt\" (UID: \"c8584ecb-dc92-4cec-9178-3017f09095da\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.791979 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.803523 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.808155 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt"] Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.820120 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.824700 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.869765 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" Mar 21 04:41:00 crc kubenswrapper[4839]: I0321 04:41:00.970186 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.002756 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.011744 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j"] Mar 21 04:41:01 crc kubenswrapper[4839]: W0321 04:41:01.039551 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3dc722f_f66c_46a0_9b1a_ae1b9c4de060.slice/crio-25b27b660f9d026738b89fe32f29d9d05c7c096d139f4969eccedc4a0ef1cdfc WatchSource:0}: Error finding container 25b27b660f9d026738b89fe32f29d9d05c7c096d139f4969eccedc4a0ef1cdfc: Status 404 returned error can't find the container with id 25b27b660f9d026738b89fe32f29d9d05c7c096d139f4969eccedc4a0ef1cdfc Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.049039 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d"] Mar 21 04:41:01 crc kubenswrapper[4839]: W0321 04:41:01.174821 4839 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a7bf7a3_acea_4059_8a89_db576f3588d1.slice/crio-95aaef1836ce9a9bef4c03957baa597a4869021c94cede1031514f565dbdf645 WatchSource:0}: Error finding container 95aaef1836ce9a9bef4c03957baa597a4869021c94cede1031514f565dbdf645: Status 404 returned error can't find the container with id 95aaef1836ce9a9bef4c03957baa597a4869021c94cede1031514f565dbdf645 Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.177502 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.190242 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.225376 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.226012 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.225916 4839 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.226225 4839 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:02.22621022 +0000 UTC m=+1066.553996886 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "metrics-server-cert" not found Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.226178 4839 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.226462 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:02.226431696 +0000 UTC m=+1066.554218372 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "webhook-server-cert" not found Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.317603 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-94vpf"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.381072 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-x75fd"] Mar 21 04:41:01 crc kubenswrapper[4839]: W0321 04:41:01.381196 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod361c2d7b_9a75_41fd_953d_4b1bd64ca6df.slice/crio-484978de3436726b3f864212d1e8ec9df3ea28eebf62b790ce717e22ea6d0465 WatchSource:0}: Error finding container 484978de3436726b3f864212d1e8ec9df3ea28eebf62b790ce717e22ea6d0465: Status 404 returned error can't find the container with id 484978de3436726b3f864212d1e8ec9df3ea28eebf62b790ce717e22ea6d0465 Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.401584 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.425470 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.430554 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " 
pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.430874 4839 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.430931 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert podName:ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b nodeName:}" failed. No retries permitted until 2026-03-21 04:41:03.430913906 +0000 UTC m=+1067.758700582 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert") pod "infra-operator-controller-manager-7b9c774f96-bsdjs" (UID: "ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b") : secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.466813 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xfbjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-sp4j4_openstack-operators(2162bafb-7e49-435c-9591-d8b725f10336): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.468416 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" podUID="2162bafb-7e49-435c-9591-d8b725f10336" Mar 21 04:41:01 crc 
kubenswrapper[4839]: I0321 04:41:01.474867 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.481443 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" event={"ID":"0c51ffa0-2285-4f7e-af09-0cafba139934","Type":"ContainerStarted","Data":"f178a46d5304a2344a0cf7e88fe08339ae279ffac5a78564c6d46cc77ef8989c"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.483144 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.549866 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" event={"ID":"fd731e7e-440b-4e77-a778-08a4a62e0c9f","Type":"ContainerStarted","Data":"187fc773e622271ea9dd4774a4a961f54233a950e74d2da6ba9f5a9b080c16eb"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.580870 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" event={"ID":"361c2d7b-9a75-41fd-953d-4b1bd64ca6df","Type":"ContainerStarted","Data":"484978de3436726b3f864212d1e8ec9df3ea28eebf62b790ce717e22ea6d0465"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.595029 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" event={"ID":"ee9d64a7-0d03-4cb0-a266-47b26f9957b5","Type":"ContainerStarted","Data":"b57c63613e5e99cde07bf4587e39393a9651f73f2d70628187580237ce767ec1"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.613275 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s"] Mar 21 04:41:01 crc 
kubenswrapper[4839]: I0321 04:41:01.615741 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" event={"ID":"7a7bf7a3-acea-4059-8a89-db576f3588d1","Type":"ContainerStarted","Data":"95aaef1836ce9a9bef4c03957baa597a4869021c94cede1031514f565dbdf645"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.623262 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" event={"ID":"acb1d7ac-b3f9-4564-8346-344ffb5c3964","Type":"ContainerStarted","Data":"d11080250b691496b64dace2092d9ed99da1740c09542dd8a197140613a4549b"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.648662 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt"] Mar 21 04:41:01 crc kubenswrapper[4839]: W0321 04:41:01.653667 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d32b541_7b80_492b_adac_e51d5090b668.slice/crio-5f3ab0be0781d2c50b7806355240791ed2d46f4185d1588c0117d51abce0987d WatchSource:0}: Error finding container 5f3ab0be0781d2c50b7806355240791ed2d46f4185d1588c0117d51abce0987d: Status 404 returned error can't find the container with id 5f3ab0be0781d2c50b7806355240791ed2d46f4185d1588c0117d51abce0987d Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.663627 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-qt58c"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.666096 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.681725 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" 
event={"ID":"6074766c-0ecd-4051-a676-dcc21b24184f","Type":"ContainerStarted","Data":"8a0e49009225321e3e5163ca7f552fd97c178ddb4ba50dc2360e71b295cda662"} Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.681881 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mgnfs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-lzbtt_openstack-operators(c8584ecb-dc92-4cec-9178-3017f09095da): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.683137 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" podUID="c8584ecb-dc92-4cec-9178-3017f09095da" Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.684103 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" event={"ID":"6914418f-3639-4ebc-a58d-d8b478cbf6b4","Type":"ContainerStarted","Data":"84d5fe1899628facaa4eb033948b29c1621a64d3687adbb83a4011a9d37b203e"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.689481 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" 
event={"ID":"05f30a88-e899-4727-9440-981d010a1342","Type":"ContainerStarted","Data":"0ab49986c77037fd9f16877166b7ad114b8319308b9c26305ab3284dd8b48804"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.691511 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" event={"ID":"ccec0d11-294b-43a2-be2e-fcef8a6818c6","Type":"ContainerStarted","Data":"09b1189a2dfcb9fb6e01a5648e76850fe32bb5469efc192ae2a1427725dda062"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.693313 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt"] Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.694181 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" event={"ID":"d3dc722f-f66c-46a0-9b1a-ae1b9c4de060","Type":"ContainerStarted","Data":"25b27b660f9d026738b89fe32f29d9d05c7c096d139f4969eccedc4a0ef1cdfc"} Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.695538 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" event={"ID":"70702cd5-6815-4a01-98a4-2f4dfaeef839","Type":"ContainerStarted","Data":"476911e8d76781fef68c6933ac928afce635c2f8a44b15d059a55811fc657a08"} Mar 21 04:41:01 crc kubenswrapper[4839]: W0321 04:41:01.699076 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eeb53bd_3988_458f_baa5_d265e0178aea.slice/crio-6228dd5a7c9da87a461d25a02176c15280a17edf7ea5ded5dcdb2960fa9d19b4 WatchSource:0}: Error finding container 6228dd5a7c9da87a461d25a02176c15280a17edf7ea5ded5dcdb2960fa9d19b4: Status 404 returned error can't find the container with id 6228dd5a7c9da87a461d25a02176c15280a17edf7ea5ded5dcdb2960fa9d19b4 Mar 21 04:41:01 crc kubenswrapper[4839]: W0321 04:41:01.702501 
4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod379b40a1_e3f5_448b_b668_0f168457e5d0.slice/crio-720a95457a517f9f5d285d605be2ad29fffdce302247f1ef12e85bcc110782da WatchSource:0}: Error finding container 720a95457a517f9f5d285d605be2ad29fffdce302247f1ef12e85bcc110782da: Status 404 returned error can't find the container with id 720a95457a517f9f5d285d605be2ad29fffdce302247f1ef12e85bcc110782da Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.704803 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bdqxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-qt58c_openstack-operators(379b40a1-e3f5-448b-b668-0f168457e5d0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.704797 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2hwvx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-7f4qh_openstack-operators(5eeb53bd-3988-458f-baa5-d265e0178aea): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.706500 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" podUID="5eeb53bd-3988-458f-baa5-d265e0178aea" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.706526 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" podUID="379b40a1-e3f5-448b-b668-0f168457e5d0" Mar 21 04:41:01 crc kubenswrapper[4839]: W0321 04:41:01.709891 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3ea9c2e_11a4_492e_9e84_8294e81ce775.slice/crio-762ee9fc75ec394ff05b9aa66977c31cc1739cac21e982ba4b0a840666eae7f8 WatchSource:0}: Error finding container 
762ee9fc75ec394ff05b9aa66977c31cc1739cac21e982ba4b0a840666eae7f8: Status 404 returned error can't find the container with id 762ee9fc75ec394ff05b9aa66977c31cc1739cac21e982ba4b0a840666eae7f8 Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.716414 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bx8b8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-btkvt_openstack-operators(d3ea9c2e-11a4-492e-9e84-8294e81ce775): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.717622 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" podUID="d3ea9c2e-11a4-492e-9e84-8294e81ce775" Mar 21 04:41:01 crc kubenswrapper[4839]: I0321 04:41:01.751078 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.751258 4839 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found 
Mar 21 04:41:01 crc kubenswrapper[4839]: E0321 04:41:01.751332 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert podName:859b11bc-e9fb-40a2-a053-66a07337965c nodeName:}" failed. No retries permitted until 2026-03-21 04:41:03.75131457 +0000 UTC m=+1068.079101246 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-8gc22" (UID: "859b11bc-e9fb-40a2-a053-66a07337965c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.256657 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.257076 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:02 crc kubenswrapper[4839]: E0321 04:41:02.257541 4839 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 04:41:02 crc kubenswrapper[4839]: E0321 04:41:02.257603 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 
nodeName:}" failed. No retries permitted until 2026-03-21 04:41:04.257590172 +0000 UTC m=+1068.585376848 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "metrics-server-cert" not found Mar 21 04:41:02 crc kubenswrapper[4839]: E0321 04:41:02.258277 4839 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 04:41:02 crc kubenswrapper[4839]: E0321 04:41:02.259299 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:04.258356753 +0000 UTC m=+1068.586143429 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "webhook-server-cert" not found Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.716373 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" event={"ID":"faac458b-73d9-4fb8-9f1c-50f7521088b0","Type":"ContainerStarted","Data":"1e35a256903f495719370ffe2429bd5fe5d7996f75d2d2e1db93599fc471e751"} Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.720389 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" event={"ID":"379b40a1-e3f5-448b-b668-0f168457e5d0","Type":"ContainerStarted","Data":"720a95457a517f9f5d285d605be2ad29fffdce302247f1ef12e85bcc110782da"} Mar 21 04:41:02 crc 
kubenswrapper[4839]: E0321 04:41:02.722922 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" podUID="379b40a1-e3f5-448b-b668-0f168457e5d0" Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.728221 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" event={"ID":"c8584ecb-dc92-4cec-9178-3017f09095da","Type":"ContainerStarted","Data":"e8410073c966fb866a357a85931b73911a01e1b42c609718749511a611851fdd"} Mar 21 04:41:02 crc kubenswrapper[4839]: E0321 04:41:02.732876 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" podUID="c8584ecb-dc92-4cec-9178-3017f09095da" Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.741464 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" event={"ID":"2045f5d2-c67e-47cd-b16d-3c69d449f099","Type":"ContainerStarted","Data":"ebc9a84925147293fca3aa5eef619abe213ed86caccf4379f911aad3558c281a"} Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.743652 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" event={"ID":"5eeb53bd-3988-458f-baa5-d265e0178aea","Type":"ContainerStarted","Data":"6228dd5a7c9da87a461d25a02176c15280a17edf7ea5ded5dcdb2960fa9d19b4"} Mar 21 04:41:02 crc 
kubenswrapper[4839]: E0321 04:41:02.746579 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" podUID="5eeb53bd-3988-458f-baa5-d265e0178aea" Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.763368 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" event={"ID":"d3ea9c2e-11a4-492e-9e84-8294e81ce775","Type":"ContainerStarted","Data":"762ee9fc75ec394ff05b9aa66977c31cc1739cac21e982ba4b0a840666eae7f8"} Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.767907 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" event={"ID":"2162bafb-7e49-435c-9591-d8b725f10336","Type":"ContainerStarted","Data":"71e02a0a009b78d60b7181c44029ff9136a64c0fca88cf247d4175cc39a0e6c3"} Mar 21 04:41:02 crc kubenswrapper[4839]: E0321 04:41:02.773124 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" podUID="2162bafb-7e49-435c-9591-d8b725f10336" Mar 21 04:41:02 crc kubenswrapper[4839]: E0321 04:41:02.773193 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" podUID="d3ea9c2e-11a4-492e-9e84-8294e81ce775" Mar 21 04:41:02 crc kubenswrapper[4839]: I0321 04:41:02.773493 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" event={"ID":"1d32b541-7b80-492b-adac-e51d5090b668","Type":"ContainerStarted","Data":"5f3ab0be0781d2c50b7806355240791ed2d46f4185d1588c0117d51abce0987d"} Mar 21 04:41:03 crc kubenswrapper[4839]: I0321 04:41:03.474241 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.474617 4839 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.474675 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert podName:ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b nodeName:}" failed. No retries permitted until 2026-03-21 04:41:07.47466086 +0000 UTC m=+1071.802447536 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert") pod "infra-operator-controller-manager-7b9c774f96-bsdjs" (UID: "ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b") : secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:03 crc kubenswrapper[4839]: I0321 04:41:03.779602 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.779880 4839 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.779925 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert podName:859b11bc-e9fb-40a2-a053-66a07337965c nodeName:}" failed. No retries permitted until 2026-03-21 04:41:07.77991076 +0000 UTC m=+1072.107697436 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-8gc22" (UID: "859b11bc-e9fb-40a2-a053-66a07337965c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.779915 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" podUID="2162bafb-7e49-435c-9591-d8b725f10336" Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.780246 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" podUID="c8584ecb-dc92-4cec-9178-3017f09095da" Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.780705 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" podUID="5eeb53bd-3988-458f-baa5-d265e0178aea" Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.780797 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" podUID="d3ea9c2e-11a4-492e-9e84-8294e81ce775" Mar 21 04:41:03 crc kubenswrapper[4839]: E0321 04:41:03.781546 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" podUID="379b40a1-e3f5-448b-b668-0f168457e5d0" Mar 21 04:41:04 crc kubenswrapper[4839]: I0321 04:41:04.288478 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:04 crc kubenswrapper[4839]: I0321 04:41:04.288681 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:04 crc kubenswrapper[4839]: E0321 04:41:04.288853 4839 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 04:41:04 crc kubenswrapper[4839]: E0321 04:41:04.288900 4839 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 04:41:04 
crc kubenswrapper[4839]: E0321 04:41:04.288933 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:08.288913359 +0000 UTC m=+1072.616700035 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "metrics-server-cert" not found Mar 21 04:41:04 crc kubenswrapper[4839]: E0321 04:41:04.288972 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:08.28895013 +0000 UTC m=+1072.616736806 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "webhook-server-cert" not found Mar 21 04:41:07 crc kubenswrapper[4839]: I0321 04:41:07.538381 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:07 crc kubenswrapper[4839]: E0321 04:41:07.538611 4839 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:07 crc kubenswrapper[4839]: E0321 04:41:07.539152 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert podName:ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b nodeName:}" failed. No retries permitted until 2026-03-21 04:41:15.539121365 +0000 UTC m=+1079.866908041 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert") pod "infra-operator-controller-manager-7b9c774f96-bsdjs" (UID: "ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b") : secret "infra-operator-webhook-server-cert" not found Mar 21 04:41:07 crc kubenswrapper[4839]: I0321 04:41:07.842667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:07 crc kubenswrapper[4839]: E0321 04:41:07.842844 4839 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:07 crc kubenswrapper[4839]: E0321 04:41:07.842889 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert podName:859b11bc-e9fb-40a2-a053-66a07337965c nodeName:}" failed. No retries permitted until 2026-03-21 04:41:15.842876053 +0000 UTC m=+1080.170662729 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-8gc22" (UID: "859b11bc-e9fb-40a2-a053-66a07337965c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 21 04:41:08 crc kubenswrapper[4839]: I0321 04:41:08.349897 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:08 crc kubenswrapper[4839]: I0321 04:41:08.349973 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:08 crc kubenswrapper[4839]: E0321 04:41:08.350100 4839 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 21 04:41:08 crc kubenswrapper[4839]: E0321 04:41:08.350177 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:16.350157684 +0000 UTC m=+1080.677944440 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "metrics-server-cert" not found Mar 21 04:41:08 crc kubenswrapper[4839]: E0321 04:41:08.350225 4839 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 21 04:41:08 crc kubenswrapper[4839]: E0321 04:41:08.350362 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs podName:06f9e67e-8978-46a1-9dc8-c511197241e2 nodeName:}" failed. No retries permitted until 2026-03-21 04:41:16.350309688 +0000 UTC m=+1080.678096454 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs") pod "openstack-operator-controller-manager-5ccd4855ff-jx6pn" (UID: "06f9e67e-8978-46a1-9dc8-c511197241e2") : secret "webhook-server-cert" not found Mar 21 04:41:15 crc kubenswrapper[4839]: I0321 04:41:15.612665 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:15 crc kubenswrapper[4839]: I0321 04:41:15.653147 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b-cert\") pod \"infra-operator-controller-manager-7b9c774f96-bsdjs\" (UID: \"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:15 crc 
kubenswrapper[4839]: I0321 04:41:15.662915 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:15 crc kubenswrapper[4839]: I0321 04:41:15.919541 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:15 crc kubenswrapper[4839]: I0321 04:41:15.924242 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/859b11bc-e9fb-40a2-a053-66a07337965c-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-8gc22\" (UID: \"859b11bc-e9fb-40a2-a053-66a07337965c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:16 crc kubenswrapper[4839]: I0321 04:41:16.015925 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:16 crc kubenswrapper[4839]: I0321 04:41:16.426531 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:16 crc kubenswrapper[4839]: I0321 04:41:16.427396 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:16 crc kubenswrapper[4839]: I0321 04:41:16.430725 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-metrics-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:16 crc kubenswrapper[4839]: I0321 04:41:16.431526 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/06f9e67e-8978-46a1-9dc8-c511197241e2-webhook-certs\") pod \"openstack-operator-controller-manager-5ccd4855ff-jx6pn\" (UID: \"06f9e67e-8978-46a1-9dc8-c511197241e2\") " pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:16 crc kubenswrapper[4839]: I0321 04:41:16.498111 4839 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vcnt8" Mar 21 04:41:16 crc kubenswrapper[4839]: I0321 04:41:16.509803 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:18 crc kubenswrapper[4839]: E0321 04:41:18.925597 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807" Mar 21 04:41:18 crc kubenswrapper[4839]: E0321 04:41:18.926277 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bzgnq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-hh27s_openstack-operators(1d32b541-7b80-492b-adac-e51d5090b668): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:41:18 crc kubenswrapper[4839]: E0321 04:41:18.927491 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" podUID="1d32b541-7b80-492b-adac-e51d5090b668" Mar 21 04:41:19 crc kubenswrapper[4839]: E0321 04:41:19.509038 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" podUID="1d32b541-7b80-492b-adac-e51d5090b668" Mar 21 04:41:20 crc kubenswrapper[4839]: E0321 04:41:20.121523 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a" Mar 21 04:41:20 crc kubenswrapper[4839]: E0321 04:41:20.121715 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tcpnh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-94vpf_openstack-operators(70702cd5-6815-4a01-98a4-2f4dfaeef839): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:41:20 crc kubenswrapper[4839]: E0321 04:41:20.123432 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" podUID="70702cd5-6815-4a01-98a4-2f4dfaeef839" Mar 21 04:41:21 crc kubenswrapper[4839]: E0321 04:41:21.114161 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" podUID="70702cd5-6815-4a01-98a4-2f4dfaeef839" Mar 21 04:41:22 crc kubenswrapper[4839]: E0321 04:41:22.289819 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 21 04:41:22 crc kubenswrapper[4839]: E0321 04:41:22.290017 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6648r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-wjw9j_openstack-operators(6914418f-3639-4ebc-a58d-d8b478cbf6b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:41:22 crc kubenswrapper[4839]: E0321 04:41:22.291184 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" podUID="6914418f-3639-4ebc-a58d-d8b478cbf6b4" Mar 21 04:41:22 crc kubenswrapper[4839]: E0321 04:41:22.804039 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777" Mar 21 04:41:22 crc kubenswrapper[4839]: E0321 04:41:22.804239 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5fztr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-8d58dc466-dncxc_openstack-operators(05f30a88-e899-4727-9440-981d010a1342): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:41:22 crc kubenswrapper[4839]: E0321 04:41:22.805464 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" podUID="05f30a88-e899-4727-9440-981d010a1342" Mar 21 04:41:23 crc kubenswrapper[4839]: E0321 04:41:23.128343 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" podUID="05f30a88-e899-4727-9440-981d010a1342" Mar 21 04:41:23 crc kubenswrapper[4839]: E0321 04:41:23.131513 4839 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" podUID="6914418f-3639-4ebc-a58d-d8b478cbf6b4" Mar 21 04:41:23 crc kubenswrapper[4839]: E0321 04:41:23.415887 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 21 04:41:23 crc kubenswrapper[4839]: E0321 04:41:23.416086 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sdzgg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-k4lg5_openstack-operators(7a7bf7a3-acea-4059-8a89-db576f3588d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:41:23 crc kubenswrapper[4839]: E0321 04:41:23.417263 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" podUID="7a7bf7a3-acea-4059-8a89-db576f3588d1" Mar 21 04:41:24 crc kubenswrapper[4839]: E0321 04:41:24.136637 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" podUID="7a7bf7a3-acea-4059-8a89-db576f3588d1" Mar 21 04:41:27 crc kubenswrapper[4839]: I0321 04:41:27.547638 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22"] Mar 21 04:41:27 crc kubenswrapper[4839]: I0321 04:41:27.626505 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs"] Mar 21 04:41:27 crc kubenswrapper[4839]: W0321 04:41:27.652474 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca2a8cd0_1c71_45bb_b4fc_4c7f82515b3b.slice/crio-b63fc0bf4c85f10ef31dbddf4ba6c783b58142632c6db0f8d741f682b33d05a5 WatchSource:0}: Error finding container b63fc0bf4c85f10ef31dbddf4ba6c783b58142632c6db0f8d741f682b33d05a5: Status 404 returned error can't find the container with id b63fc0bf4c85f10ef31dbddf4ba6c783b58142632c6db0f8d741f682b33d05a5 Mar 21 04:41:27 crc kubenswrapper[4839]: I0321 04:41:27.781549 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn"] Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.188815 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" event={"ID":"6074766c-0ecd-4051-a676-dcc21b24184f","Type":"ContainerStarted","Data":"9a9f78cf0a14d5de0729d09dbdb5cf3e9768422484d405dc6ebae8447edfda36"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.189805 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" Mar 21 04:41:28 crc 
kubenswrapper[4839]: I0321 04:41:28.212888 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" event={"ID":"2162bafb-7e49-435c-9591-d8b725f10336","Type":"ContainerStarted","Data":"a103cedf709e0f1efd5371c0501a46a1069b5e2c9d2f96700d6110af18b60471"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.213730 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.220041 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" event={"ID":"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b","Type":"ContainerStarted","Data":"b63fc0bf4c85f10ef31dbddf4ba6c783b58142632c6db0f8d741f682b33d05a5"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.237817 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" event={"ID":"ccec0d11-294b-43a2-be2e-fcef8a6818c6","Type":"ContainerStarted","Data":"41473f2c5d8c86b32cdc3c2a6d5e0e216ac9b1861edefe0e092ba5e6634ccefa"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.238602 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.261941 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" event={"ID":"2045f5d2-c67e-47cd-b16d-3c69d449f099","Type":"ContainerStarted","Data":"67acf10ed6bac7057aee76136d969a692d945cbe0d46b587a7c4f9f547fe11f6"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.262772 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" 
Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.277798 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" podStartSLOduration=7.300646789 podStartE2EDuration="29.277778984s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.418690834 +0000 UTC m=+1065.746477500" lastFinishedPulling="2026-03-21 04:41:23.395823019 +0000 UTC m=+1087.723609695" observedRunningTime="2026-03-21 04:41:28.227745454 +0000 UTC m=+1092.555532130" watchObservedRunningTime="2026-03-21 04:41:28.277778984 +0000 UTC m=+1092.605565660" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.290693 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" podStartSLOduration=3.606030701 podStartE2EDuration="29.290670495s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.466641756 +0000 UTC m=+1065.794428432" lastFinishedPulling="2026-03-21 04:41:27.15128156 +0000 UTC m=+1091.479068226" observedRunningTime="2026-03-21 04:41:28.278822723 +0000 UTC m=+1092.606609399" watchObservedRunningTime="2026-03-21 04:41:28.290670495 +0000 UTC m=+1092.618457171" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.292875 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" event={"ID":"d3dc722f-f66c-46a0-9b1a-ae1b9c4de060","Type":"ContainerStarted","Data":"634de7da255cd804a2e18816b291b64eb7d000a599f377038f3ddc0140788ad6"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.293666 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.305621 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" event={"ID":"06f9e67e-8978-46a1-9dc8-c511197241e2","Type":"ContainerStarted","Data":"55bdf6e62bbc1a37ec4b18a5aba81bffda0650b9566b70d88a865acf43d8d066"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.305676 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" event={"ID":"06f9e67e-8978-46a1-9dc8-c511197241e2","Type":"ContainerStarted","Data":"4a7a84c40d7546d7e282c94ae650b82aa400a266a6a30a4f3c9a726d568986c2"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.306397 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.332847 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" podStartSLOduration=6.041056262 podStartE2EDuration="29.332827954s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.19903764 +0000 UTC m=+1065.526824316" lastFinishedPulling="2026-03-21 04:41:24.490809312 +0000 UTC m=+1088.818596008" observedRunningTime="2026-03-21 04:41:28.329949914 +0000 UTC m=+1092.657736590" watchObservedRunningTime="2026-03-21 04:41:28.332827954 +0000 UTC m=+1092.660614630" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.344897 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" event={"ID":"d3ea9c2e-11a4-492e-9e84-8294e81ce775","Type":"ContainerStarted","Data":"c35cc8d062eac13eebcfe68346660c95e96b656d2e63c934431da28d13763fc9"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.345621 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.368730 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" event={"ID":"faac458b-73d9-4fb8-9f1c-50f7521088b0","Type":"ContainerStarted","Data":"711fedfa565003f3e8676b238fb5732b71d1179ad5a4f3ce139b01d353c78c5b"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.369373 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.377149 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" event={"ID":"c8584ecb-dc92-4cec-9178-3017f09095da","Type":"ContainerStarted","Data":"5225ba85403837e007c27f0908ad15aaa3a398465f58d263aa47458eb177b30a"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.382461 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" podStartSLOduration=7.45296122 podStartE2EDuration="29.382438982s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.466233574 +0000 UTC m=+1065.794020250" lastFinishedPulling="2026-03-21 04:41:23.395711336 +0000 UTC m=+1087.723498012" observedRunningTime="2026-03-21 04:41:28.381905527 +0000 UTC m=+1092.709692193" watchObservedRunningTime="2026-03-21 04:41:28.382438982 +0000 UTC m=+1092.710225668" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.396787 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" event={"ID":"5eeb53bd-3988-458f-baa5-d265e0178aea","Type":"ContainerStarted","Data":"3243ecf3495ef3b03890baf0dd8f20305c3138a5be4bd45c487e00a1ea0e92ef"} Mar 21 
04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.397448 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.410631 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" event={"ID":"0c51ffa0-2285-4f7e-af09-0cafba139934","Type":"ContainerStarted","Data":"2c88204fa488213d4a01f36f6ead51e69c30b08d8733182a70d861da1c63aa67"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.411392 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.418897 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" event={"ID":"fd731e7e-440b-4e77-a778-08a4a62e0c9f","Type":"ContainerStarted","Data":"f6d7298b89265ea0b0e32733b5171ef0302b5814bb0a0b1807e3695a32147290"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.419690 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.434467 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" event={"ID":"361c2d7b-9a75-41fd-953d-4b1bd64ca6df","Type":"ContainerStarted","Data":"399588d5878e8f33a9fe43bf1f68cd960748e5bc08fc984a42824e5d44050afe"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.435126 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.475065 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" event={"ID":"ee9d64a7-0d03-4cb0-a266-47b26f9957b5","Type":"ContainerStarted","Data":"020e99262e5422112c1e7f4291e1e1bb460871d7f1abd0108bc960753b39b0bb"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.475104 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" event={"ID":"859b11bc-e9fb-40a2-a053-66a07337965c","Type":"ContainerStarted","Data":"e0600411406dd7042d97e44d13c8fe878d34e3cf93a42f82842f96fe3401748c"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.475122 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.479864 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" event={"ID":"acb1d7ac-b3f9-4564-8346-344ffb5c3964","Type":"ContainerStarted","Data":"fa4799519581ae9bbfb1067af15231c49e45fbcc764ac42ff1560afc251a03c0"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.480264 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.488890 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" event={"ID":"379b40a1-e3f5-448b-b668-0f168457e5d0","Type":"ContainerStarted","Data":"0efd5f828c3c703c7d4cfecb83afa6f34b7fb100ae7c48ed515b9e78bae52115"} Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.489706 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.495916 4839 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" podStartSLOduration=7.556168027 podStartE2EDuration="29.495894196s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.463191089 +0000 UTC m=+1065.790977765" lastFinishedPulling="2026-03-21 04:41:23.402917258 +0000 UTC m=+1087.730703934" observedRunningTime="2026-03-21 04:41:28.495753552 +0000 UTC m=+1092.823540228" watchObservedRunningTime="2026-03-21 04:41:28.495894196 +0000 UTC m=+1092.823680892" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.496998 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" podStartSLOduration=6.048641534 podStartE2EDuration="29.496987807s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.042700696 +0000 UTC m=+1065.370487372" lastFinishedPulling="2026-03-21 04:41:24.491046949 +0000 UTC m=+1088.818833645" observedRunningTime="2026-03-21 04:41:28.448805659 +0000 UTC m=+1092.776592335" watchObservedRunningTime="2026-03-21 04:41:28.496987807 +0000 UTC m=+1092.824774493" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.528057 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" podStartSLOduration=3.9756173390000002 podStartE2EDuration="29.528038025s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.716281049 +0000 UTC m=+1066.044067735" lastFinishedPulling="2026-03-21 04:41:27.268701745 +0000 UTC m=+1091.596488421" observedRunningTime="2026-03-21 04:41:28.526117962 +0000 UTC m=+1092.853904638" watchObservedRunningTime="2026-03-21 04:41:28.528038025 +0000 UTC m=+1092.855824701" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.614304 4839 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" podStartSLOduration=28.614285558 podStartE2EDuration="28.614285558s" podCreationTimestamp="2026-03-21 04:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:41:28.61292741 +0000 UTC m=+1092.940714096" watchObservedRunningTime="2026-03-21 04:41:28.614285558 +0000 UTC m=+1092.942072234" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.617077 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-lzbtt" podStartSLOduration=2.9663166199999997 podStartE2EDuration="28.617059586s" podCreationTimestamp="2026-03-21 04:41:00 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.681736903 +0000 UTC m=+1066.009523579" lastFinishedPulling="2026-03-21 04:41:27.332479869 +0000 UTC m=+1091.660266545" observedRunningTime="2026-03-21 04:41:28.557905901 +0000 UTC m=+1092.885692577" watchObservedRunningTime="2026-03-21 04:41:28.617059586 +0000 UTC m=+1092.944846262" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.655720 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" podStartSLOduration=6.001018561 podStartE2EDuration="29.655697417s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:00.835996883 +0000 UTC m=+1065.163783559" lastFinishedPulling="2026-03-21 04:41:24.490675739 +0000 UTC m=+1088.818462415" observedRunningTime="2026-03-21 04:41:28.648405673 +0000 UTC m=+1092.976192349" watchObservedRunningTime="2026-03-21 04:41:28.655697417 +0000 UTC m=+1092.983484093" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.684133 4839 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" podStartSLOduration=4.121099691 podStartE2EDuration="29.684115112s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.704634974 +0000 UTC m=+1066.032421640" lastFinishedPulling="2026-03-21 04:41:27.267650385 +0000 UTC m=+1091.595437061" observedRunningTime="2026-03-21 04:41:28.680227673 +0000 UTC m=+1093.008014359" watchObservedRunningTime="2026-03-21 04:41:28.684115112 +0000 UTC m=+1093.011901788" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.722024 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" podStartSLOduration=4.264109832 podStartE2EDuration="29.722002732s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.704640824 +0000 UTC m=+1066.032427500" lastFinishedPulling="2026-03-21 04:41:27.162533724 +0000 UTC m=+1091.490320400" observedRunningTime="2026-03-21 04:41:28.713917295 +0000 UTC m=+1093.041703981" watchObservedRunningTime="2026-03-21 04:41:28.722002732 +0000 UTC m=+1093.049789408" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.749700 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" podStartSLOduration=6.6804999 podStartE2EDuration="29.749675886s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.422078609 +0000 UTC m=+1065.749865285" lastFinishedPulling="2026-03-21 04:41:24.491254585 +0000 UTC m=+1088.819041271" observedRunningTime="2026-03-21 04:41:28.748819262 +0000 UTC m=+1093.076605938" watchObservedRunningTime="2026-03-21 04:41:28.749675886 +0000 UTC m=+1093.077462572" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.776106 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" podStartSLOduration=6.373442051 podStartE2EDuration="29.776090725s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.088335993 +0000 UTC m=+1065.416122669" lastFinishedPulling="2026-03-21 04:41:24.490984647 +0000 UTC m=+1088.818771343" observedRunningTime="2026-03-21 04:41:28.774461509 +0000 UTC m=+1093.102248205" watchObservedRunningTime="2026-03-21 04:41:28.776090725 +0000 UTC m=+1093.103877401" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.808654 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" podStartSLOduration=7.38825609 podStartE2EDuration="29.808636635s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:00.982540823 +0000 UTC m=+1065.310327499" lastFinishedPulling="2026-03-21 04:41:23.402921368 +0000 UTC m=+1087.730708044" observedRunningTime="2026-03-21 04:41:28.80344031 +0000 UTC m=+1093.131226986" watchObservedRunningTime="2026-03-21 04:41:28.808636635 +0000 UTC m=+1093.136423311" Mar 21 04:41:28 crc kubenswrapper[4839]: I0321 04:41:28.832843 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" podStartSLOduration=7.272967195 podStartE2EDuration="29.832824932s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:00.835998223 +0000 UTC m=+1065.163784899" lastFinishedPulling="2026-03-21 04:41:23.39585596 +0000 UTC m=+1087.723642636" observedRunningTime="2026-03-21 04:41:28.8309806 +0000 UTC m=+1093.158767276" watchObservedRunningTime="2026-03-21 04:41:28.832824932 +0000 UTC m=+1093.160611608" Mar 21 04:41:30 crc kubenswrapper[4839]: I0321 04:41:30.979874 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:41:30 crc kubenswrapper[4839]: I0321 04:41:30.980258 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.527069 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" event={"ID":"1d32b541-7b80-492b-adac-e51d5090b668","Type":"ContainerStarted","Data":"d6f1629e94084e4cd740063bed823aafe4e138a7b52af9e308c678da34c08dab"} Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.527642 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.528990 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" event={"ID":"ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b","Type":"ContainerStarted","Data":"f8d051f6645b73a056ba3e02d18242fa3c2872a559ea6446dedc430614cafbd7"} Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.529118 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.530877 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" 
event={"ID":"859b11bc-e9fb-40a2-a053-66a07337965c","Type":"ContainerStarted","Data":"c8487d32c5de78d29c1475c53de91c166771fbb61a68ec96854bfb0de661a1b1"} Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.531050 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.549859 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" podStartSLOduration=2.554344764 podStartE2EDuration="32.549839766s" podCreationTimestamp="2026-03-21 04:41:00 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.68163365 +0000 UTC m=+1066.009420326" lastFinishedPulling="2026-03-21 04:41:31.677128632 +0000 UTC m=+1096.004915328" observedRunningTime="2026-03-21 04:41:32.542535072 +0000 UTC m=+1096.870321768" watchObservedRunningTime="2026-03-21 04:41:32.549839766 +0000 UTC m=+1096.877626442" Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.555513 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" podStartSLOduration=29.534562028 podStartE2EDuration="33.555488875s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:27.65496668 +0000 UTC m=+1091.982753356" lastFinishedPulling="2026-03-21 04:41:31.675893527 +0000 UTC m=+1096.003680203" observedRunningTime="2026-03-21 04:41:32.555463034 +0000 UTC m=+1096.883249730" watchObservedRunningTime="2026-03-21 04:41:32.555488875 +0000 UTC m=+1096.883275551" Mar 21 04:41:32 crc kubenswrapper[4839]: I0321 04:41:32.588445 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" podStartSLOduration=29.476132373 podStartE2EDuration="33.588428546s" 
podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:27.564829259 +0000 UTC m=+1091.892615935" lastFinishedPulling="2026-03-21 04:41:31.677125422 +0000 UTC m=+1096.004912108" observedRunningTime="2026-03-21 04:41:32.58785887 +0000 UTC m=+1096.915645556" watchObservedRunningTime="2026-03-21 04:41:32.588428546 +0000 UTC m=+1096.916215222" Mar 21 04:41:34 crc kubenswrapper[4839]: I0321 04:41:34.546553 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" event={"ID":"70702cd5-6815-4a01-98a4-2f4dfaeef839","Type":"ContainerStarted","Data":"b58db27d2d95e030329747eea38da9c271f47fb2d475f66104b3ccb3429a2c3c"} Mar 21 04:41:34 crc kubenswrapper[4839]: I0321 04:41:34.547350 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" Mar 21 04:41:34 crc kubenswrapper[4839]: I0321 04:41:34.562533 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" podStartSLOduration=3.050101138 podStartE2EDuration="35.562517421s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.345460486 +0000 UTC m=+1065.673247162" lastFinishedPulling="2026-03-21 04:41:33.857876739 +0000 UTC m=+1098.185663445" observedRunningTime="2026-03-21 04:41:34.559422134 +0000 UTC m=+1098.887208810" watchObservedRunningTime="2026-03-21 04:41:34.562517421 +0000 UTC m=+1098.890304097" Mar 21 04:41:35 crc kubenswrapper[4839]: I0321 04:41:35.556074 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" event={"ID":"05f30a88-e899-4727-9440-981d010a1342","Type":"ContainerStarted","Data":"a1f568e57e5085c011ddf0acbcf6148bbd55263b2b00d976af4bff35fd113311"} Mar 21 04:41:35 crc kubenswrapper[4839]: I0321 
04:41:35.556578 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" Mar 21 04:41:35 crc kubenswrapper[4839]: I0321 04:41:35.577319 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" podStartSLOduration=2.340336302 podStartE2EDuration="36.577296579s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:00.644686391 +0000 UTC m=+1064.972473067" lastFinishedPulling="2026-03-21 04:41:34.881646668 +0000 UTC m=+1099.209433344" observedRunningTime="2026-03-21 04:41:35.572903616 +0000 UTC m=+1099.900690312" watchObservedRunningTime="2026-03-21 04:41:35.577296579 +0000 UTC m=+1099.905083255" Mar 21 04:41:36 crc kubenswrapper[4839]: I0321 04:41:36.026076 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-8gc22" Mar 21 04:41:36 crc kubenswrapper[4839]: I0321 04:41:36.515126 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5ccd4855ff-jx6pn" Mar 21 04:41:36 crc kubenswrapper[4839]: I0321 04:41:36.575317 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" event={"ID":"7a7bf7a3-acea-4059-8a89-db576f3588d1","Type":"ContainerStarted","Data":"f0607583941d87b541d87529b02738cfb39b0b3cd9d1173a5fa1a97050d8e31d"} Mar 21 04:41:36 crc kubenswrapper[4839]: I0321 04:41:36.576020 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.593174 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" event={"ID":"6914418f-3639-4ebc-a58d-d8b478cbf6b4","Type":"ContainerStarted","Data":"a0218d33682d53e141602fa8319c3ab211da923160ce7bbabde66b108e30e250"} Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.594083 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.607045 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" podStartSLOduration=5.795175313 podStartE2EDuration="40.607027332s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.177558569 +0000 UTC m=+1065.505345245" lastFinishedPulling="2026-03-21 04:41:35.989410578 +0000 UTC m=+1100.317197264" observedRunningTime="2026-03-21 04:41:36.603710124 +0000 UTC m=+1100.931496800" watchObservedRunningTime="2026-03-21 04:41:39.607027332 +0000 UTC m=+1103.934814008" Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.607426 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" podStartSLOduration=3.3295372260000002 podStartE2EDuration="40.607420873s" podCreationTimestamp="2026-03-21 04:40:59 +0000 UTC" firstStartedPulling="2026-03-21 04:41:01.024783645 +0000 UTC m=+1065.352570321" lastFinishedPulling="2026-03-21 04:41:38.302667282 +0000 UTC m=+1102.630453968" observedRunningTime="2026-03-21 04:41:39.606676622 +0000 UTC m=+1103.934463298" watchObservedRunningTime="2026-03-21 04:41:39.607420873 +0000 UTC m=+1103.935207549" Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.777475 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-2mkmz" Mar 21 04:41:39 crc 
kubenswrapper[4839]: I0321 04:41:39.824756 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-9s4vt" Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.866124 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-6s6q7" Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.872115 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-2n27d" Mar 21 04:41:39 crc kubenswrapper[4839]: I0321 04:41:39.940098 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-d7h7r" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.048100 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8sg4d" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.201366 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gzh8j" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.228130 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-94vpf" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.270520 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-sp4j4" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.368727 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-6p4mn" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.444406 4839 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qt58c" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.528957 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-x75fd" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.549102 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xt7xt" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.795042 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-btkvt" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.824147 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-7f4qh" Mar 21 04:41:40 crc kubenswrapper[4839]: I0321 04:41:40.827810 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hh27s" Mar 21 04:41:45 crc kubenswrapper[4839]: I0321 04:41:45.668628 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-bsdjs" Mar 21 04:41:49 crc kubenswrapper[4839]: I0321 04:41:49.787010 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-dncxc" Mar 21 04:41:50 crc kubenswrapper[4839]: I0321 04:41:50.007276 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wjw9j" Mar 21 04:41:50 crc kubenswrapper[4839]: I0321 04:41:50.178478 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k4lg5" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.161612 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567802-zsmks"] Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.163034 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567802-zsmks" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.166809 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.167184 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.167742 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.171122 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567802-zsmks"] Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.224696 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktvvb\" (UniqueName: \"kubernetes.io/projected/ab3902e0-a483-447f-b86c-4fe8e8983152-kube-api-access-ktvvb\") pod \"auto-csr-approver-29567802-zsmks\" (UID: \"ab3902e0-a483-447f-b86c-4fe8e8983152\") " pod="openshift-infra/auto-csr-approver-29567802-zsmks" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.326080 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktvvb\" (UniqueName: \"kubernetes.io/projected/ab3902e0-a483-447f-b86c-4fe8e8983152-kube-api-access-ktvvb\") pod \"auto-csr-approver-29567802-zsmks\" (UID: \"ab3902e0-a483-447f-b86c-4fe8e8983152\") " 
pod="openshift-infra/auto-csr-approver-29567802-zsmks" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.343825 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktvvb\" (UniqueName: \"kubernetes.io/projected/ab3902e0-a483-447f-b86c-4fe8e8983152-kube-api-access-ktvvb\") pod \"auto-csr-approver-29567802-zsmks\" (UID: \"ab3902e0-a483-447f-b86c-4fe8e8983152\") " pod="openshift-infra/auto-csr-approver-29567802-zsmks" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.482445 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567802-zsmks" Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.879780 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567802-zsmks"] Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.980373 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:42:00 crc kubenswrapper[4839]: I0321 04:42:00.980438 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:42:01 crc kubenswrapper[4839]: I0321 04:42:01.744917 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567802-zsmks" event={"ID":"ab3902e0-a483-447f-b86c-4fe8e8983152","Type":"ContainerStarted","Data":"cf16797bf9f83a86eb511379fcad611d80959bf312445466bab1c78ca2ba1616"} Mar 21 04:42:04 crc kubenswrapper[4839]: I0321 04:42:04.769842 4839 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567802-zsmks" event={"ID":"ab3902e0-a483-447f-b86c-4fe8e8983152","Type":"ContainerStarted","Data":"13a62d6a43116fe61b0ca05db07b93400dd1b1d3d2760f545556037c6e4992fd"} Mar 21 04:42:04 crc kubenswrapper[4839]: I0321 04:42:04.782888 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567802-zsmks" podStartSLOduration=1.362470622 podStartE2EDuration="4.782871449s" podCreationTimestamp="2026-03-21 04:42:00 +0000 UTC" firstStartedPulling="2026-03-21 04:42:00.893745789 +0000 UTC m=+1125.221532465" lastFinishedPulling="2026-03-21 04:42:04.314146616 +0000 UTC m=+1128.641933292" observedRunningTime="2026-03-21 04:42:04.78074747 +0000 UTC m=+1129.108534156" watchObservedRunningTime="2026-03-21 04:42:04.782871449 +0000 UTC m=+1129.110658125" Mar 21 04:42:05 crc kubenswrapper[4839]: I0321 04:42:05.777980 4839 generic.go:334] "Generic (PLEG): container finished" podID="ab3902e0-a483-447f-b86c-4fe8e8983152" containerID="13a62d6a43116fe61b0ca05db07b93400dd1b1d3d2760f545556037c6e4992fd" exitCode=0 Mar 21 04:42:05 crc kubenswrapper[4839]: I0321 04:42:05.778030 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567802-zsmks" event={"ID":"ab3902e0-a483-447f-b86c-4fe8e8983152","Type":"ContainerDied","Data":"13a62d6a43116fe61b0ca05db07b93400dd1b1d3d2760f545556037c6e4992fd"} Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.057322 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567802-zsmks" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.126745 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktvvb\" (UniqueName: \"kubernetes.io/projected/ab3902e0-a483-447f-b86c-4fe8e8983152-kube-api-access-ktvvb\") pod \"ab3902e0-a483-447f-b86c-4fe8e8983152\" (UID: \"ab3902e0-a483-447f-b86c-4fe8e8983152\") " Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.134854 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab3902e0-a483-447f-b86c-4fe8e8983152-kube-api-access-ktvvb" (OuterVolumeSpecName: "kube-api-access-ktvvb") pod "ab3902e0-a483-447f-b86c-4fe8e8983152" (UID: "ab3902e0-a483-447f-b86c-4fe8e8983152"). InnerVolumeSpecName "kube-api-access-ktvvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.227961 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktvvb\" (UniqueName: \"kubernetes.io/projected/ab3902e0-a483-447f-b86c-4fe8e8983152-kube-api-access-ktvvb\") on node \"crc\" DevicePath \"\"" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.413743 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zn278"] Mar 21 04:42:07 crc kubenswrapper[4839]: E0321 04:42:07.414427 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab3902e0-a483-447f-b86c-4fe8e8983152" containerName="oc" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.414449 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab3902e0-a483-447f-b86c-4fe8e8983152" containerName="oc" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.414688 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab3902e0-a483-447f-b86c-4fe8e8983152" containerName="oc" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.415558 4839 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zn278" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.417257 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-8nc52" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.417435 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.417546 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.419240 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.434858 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef534e12-a75b-4e9b-b2a3-4046bece5903-config\") pod \"dnsmasq-dns-675f4bcbfc-zn278\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zn278" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.435196 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkmxw\" (UniqueName: \"kubernetes.io/projected/ef534e12-a75b-4e9b-b2a3-4046bece5903-kube-api-access-vkmxw\") pod \"dnsmasq-dns-675f4bcbfc-zn278\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zn278" Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.446224 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zn278"] Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.487703 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7tj8n"] Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.489142 4839 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.492041 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.501802 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7tj8n"]
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.538321 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef534e12-a75b-4e9b-b2a3-4046bece5903-config\") pod \"dnsmasq-dns-675f4bcbfc-zn278\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zn278"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.538422 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkmxw\" (UniqueName: \"kubernetes.io/projected/ef534e12-a75b-4e9b-b2a3-4046bece5903-kube-api-access-vkmxw\") pod \"dnsmasq-dns-675f4bcbfc-zn278\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zn278"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.539656 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef534e12-a75b-4e9b-b2a3-4046bece5903-config\") pod \"dnsmasq-dns-675f4bcbfc-zn278\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zn278"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.578623 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkmxw\" (UniqueName: \"kubernetes.io/projected/ef534e12-a75b-4e9b-b2a3-4046bece5903-kube-api-access-vkmxw\") pod \"dnsmasq-dns-675f4bcbfc-zn278\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zn278"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.639420 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.639765 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-config\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.639932 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grztj\" (UniqueName: \"kubernetes.io/projected/2f135620-f512-4bfe-9875-4d4e07c6a0f5-kube-api-access-grztj\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.740987 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-config\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.741038 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grztj\" (UniqueName: \"kubernetes.io/projected/2f135620-f512-4bfe-9875-4d4e07c6a0f5-kube-api-access-grztj\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.741098 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.741823 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.742323 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-config\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.752996 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zn278"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.758360 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grztj\" (UniqueName: \"kubernetes.io/projected/2f135620-f512-4bfe-9875-4d4e07c6a0f5-kube-api-access-grztj\") pod \"dnsmasq-dns-78dd6ddcc-7tj8n\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.791366 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567802-zsmks" event={"ID":"ab3902e0-a483-447f-b86c-4fe8e8983152","Type":"ContainerDied","Data":"cf16797bf9f83a86eb511379fcad611d80959bf312445466bab1c78ca2ba1616"}
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.791632 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf16797bf9f83a86eb511379fcad611d80959bf312445466bab1c78ca2ba1616"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.791755 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567802-zsmks"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.813990 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n"
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.855763 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567796-c5w5j"]
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.868438 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567796-c5w5j"]
Mar 21 04:42:07 crc kubenswrapper[4839]: I0321 04:42:07.986928 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zn278"]
Mar 21 04:42:07 crc kubenswrapper[4839]: W0321 04:42:07.990761 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef534e12_a75b_4e9b_b2a3_4046bece5903.slice/crio-8d9c14a410c4252b3bc652d39bc6ba19cb56d790ccb813a314300ee40264a16c WatchSource:0}: Error finding container 8d9c14a410c4252b3bc652d39bc6ba19cb56d790ccb813a314300ee40264a16c: Status 404 returned error can't find the container with id 8d9c14a410c4252b3bc652d39bc6ba19cb56d790ccb813a314300ee40264a16c
Mar 21 04:42:08 crc kubenswrapper[4839]: I0321 04:42:08.256986 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7tj8n"]
Mar 21 04:42:08 crc kubenswrapper[4839]: I0321 04:42:08.462536 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b" path="/var/lib/kubelet/pods/c0a0d1c7-7c95-46aa-81de-f2a3d91bea6b/volumes"
Mar 21 04:42:08 crc kubenswrapper[4839]: I0321 04:42:08.800551 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n" event={"ID":"2f135620-f512-4bfe-9875-4d4e07c6a0f5","Type":"ContainerStarted","Data":"2be9891bd895defb2ca0d28836ebf421c8763119934d2eefdf49a40b80778096"}
Mar 21 04:42:08 crc kubenswrapper[4839]: I0321 04:42:08.802459 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-zn278" event={"ID":"ef534e12-a75b-4e9b-b2a3-4046bece5903","Type":"ContainerStarted","Data":"8d9c14a410c4252b3bc652d39bc6ba19cb56d790ccb813a314300ee40264a16c"}
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.319885 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zn278"]
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.348979 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-d99qf"]
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.354806 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.373063 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-d99qf"]
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.484951 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.485224 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-config\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.485277 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djkvq\" (UniqueName: \"kubernetes.io/projected/45f7903c-4081-446b-94d9-ad979332590b-kube-api-access-djkvq\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.586290 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.586349 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-config\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.586393 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djkvq\" (UniqueName: \"kubernetes.io/projected/45f7903c-4081-446b-94d9-ad979332590b-kube-api-access-djkvq\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.589070 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-config\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.589620 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.611014 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djkvq\" (UniqueName: \"kubernetes.io/projected/45f7903c-4081-446b-94d9-ad979332590b-kube-api-access-djkvq\") pod \"dnsmasq-dns-5ccc8479f9-d99qf\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.623959 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7tj8n"]
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.667457 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hpzj8"]
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.668672 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.673508 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hpzj8"]
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.684913 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.791763 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cxnx\" (UniqueName: \"kubernetes.io/projected/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-kube-api-access-6cxnx\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.791873 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-config\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.791946 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.894814 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-config\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.894918 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.894976 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cxnx\" (UniqueName: \"kubernetes.io/projected/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-kube-api-access-6cxnx\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.895981 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-config\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.896029 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.914473 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cxnx\" (UniqueName: \"kubernetes.io/projected/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-kube-api-access-6cxnx\") pod \"dnsmasq-dns-57d769cc4f-hpzj8\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:10 crc kubenswrapper[4839]: I0321 04:42:10.985995 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.521128 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.523104 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.525115 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.525972 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.526247 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.526534 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.527106 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.527259 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.527477 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wq8rw"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.527658 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709491 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709534 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709590 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709719 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709755 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb4vz\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-kube-api-access-rb4vz\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709791 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709848 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709885 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709940 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709968 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.709995 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.799938 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.801304 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.806367 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.806545 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.807544 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-nxhtb"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.807726 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.807966 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.808247 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.808498 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.810781 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.810837 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.810873 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.810903 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.810932 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.810984 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.811005 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.811034 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.811072 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.811095 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb4vz\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-kube-api-access-rb4vz\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.811132 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.812150 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.812405 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.812468 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.812821 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.816018 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.816192 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.816363 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.818195 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.827633 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.835276 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb4vz\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-kube-api-access-rb4vz\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.836425 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.843497 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.844026 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.847900 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.912986 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913029 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2cd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-kube-api-access-vh2cd\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913073 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-config-data\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913091 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913153 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913220 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8028561c-b039-4400-a065-b5efee753b5f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913256 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913332 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913438 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913481 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8028561c-b039-4400-a065-b5efee753b5f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:11 crc kubenswrapper[4839]: I0321 04:42:11.913507 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015016 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015071 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8028561c-b039-4400-a065-b5efee753b5f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015097 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015180 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015208 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2cd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-kube-api-access-vh2cd\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015238 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-config-data\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015261 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015284 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0"
Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015308
4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8028561c-b039-4400-a065-b5efee753b5f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015332 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015368 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.015738 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.019388 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-config-data\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.019754 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.019905 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.020865 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.023937 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.024261 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.031300 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8028561c-b039-4400-a065-b5efee753b5f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc 
kubenswrapper[4839]: I0321 04:42:12.032272 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8028561c-b039-4400-a065-b5efee753b5f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.037791 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.063279 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.073650 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2cd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-kube-api-access-vh2cd\") pod \"rabbitmq-server-0\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " pod="openstack/rabbitmq-server-0" Mar 21 04:42:12 crc kubenswrapper[4839]: I0321 04:42:12.197671 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.143858 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.146482 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.152706 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.152783 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-p2lmh" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.152890 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.152970 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.153010 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.157683 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236273 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-config-data-default\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236467 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f1edf0d-f220-4815-aeb6-e4507576247a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236500 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-qv6bv\" (UniqueName: \"kubernetes.io/projected/4f1edf0d-f220-4815-aeb6-e4507576247a-kube-api-access-qv6bv\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236539 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-kolla-config\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236682 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236708 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236771 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f1edf0d-f220-4815-aeb6-e4507576247a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.236880 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4f1edf0d-f220-4815-aeb6-e4507576247a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338041 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1edf0d-f220-4815-aeb6-e4507576247a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338096 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-config-data-default\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338112 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f1edf0d-f220-4815-aeb6-e4507576247a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338130 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv6bv\" (UniqueName: \"kubernetes.io/projected/4f1edf0d-f220-4815-aeb6-e4507576247a-kube-api-access-qv6bv\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338156 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338198 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338215 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338236 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f1edf0d-f220-4815-aeb6-e4507576247a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338588 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.338718 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f1edf0d-f220-4815-aeb6-e4507576247a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 
04:42:13.339448 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-kolla-config\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.339591 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-config-data-default\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.340061 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f1edf0d-f220-4815-aeb6-e4507576247a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.346212 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1edf0d-f220-4815-aeb6-e4507576247a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.347665 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f1edf0d-f220-4815-aeb6-e4507576247a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.366556 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv6bv\" (UniqueName: 
\"kubernetes.io/projected/4f1edf0d-f220-4815-aeb6-e4507576247a-kube-api-access-qv6bv\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.368875 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"4f1edf0d-f220-4815-aeb6-e4507576247a\") " pod="openstack/openstack-galera-0" Mar 21 04:42:13 crc kubenswrapper[4839]: I0321 04:42:13.466367 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.418676 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.419777 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.424888 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-shglw" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.430148 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.431991 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.432275 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.449832 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.558525 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.559813 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.559877 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.560078 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.560142 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw629\" (UniqueName: \"kubernetes.io/projected/30d22e92-45bd-4d1e-954e-3ade801245d4-kube-api-access-vw629\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.560177 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30d22e92-45bd-4d1e-954e-3ade801245d4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.560213 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30d22e92-45bd-4d1e-954e-3ade801245d4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.562962 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d22e92-45bd-4d1e-954e-3ade801245d4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.664894 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.664972 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw629\" (UniqueName: \"kubernetes.io/projected/30d22e92-45bd-4d1e-954e-3ade801245d4-kube-api-access-vw629\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.664998 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30d22e92-45bd-4d1e-954e-3ade801245d4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.665022 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30d22e92-45bd-4d1e-954e-3ade801245d4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.665046 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d22e92-45bd-4d1e-954e-3ade801245d4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.665099 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.665147 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.665171 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.665539 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.665775 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30d22e92-45bd-4d1e-954e-3ade801245d4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.666616 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.666646 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.667037 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/30d22e92-45bd-4d1e-954e-3ade801245d4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.670015 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30d22e92-45bd-4d1e-954e-3ade801245d4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.683755 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d22e92-45bd-4d1e-954e-3ade801245d4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.687315 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw629\" (UniqueName: \"kubernetes.io/projected/30d22e92-45bd-4d1e-954e-3ade801245d4-kube-api-access-vw629\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.688620 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30d22e92-45bd-4d1e-954e-3ade801245d4\") " pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.740463 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.757893 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.758949 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.763363 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.763392 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4xnqn" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.767430 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.776867 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.870446 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dsjx\" (UniqueName: \"kubernetes.io/projected/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-kube-api-access-9dsjx\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.870514 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-config-data\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.870561 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.870629 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.870725 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-kolla-config\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.972224 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-kolla-config\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.972309 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dsjx\" (UniqueName: \"kubernetes.io/projected/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-kube-api-access-9dsjx\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.972352 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-config-data\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" 
Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.972383 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.972411 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.973307 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-kolla-config\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.973341 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-config-data\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.975916 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.987826 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:14 crc kubenswrapper[4839]: I0321 04:42:14.998859 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dsjx\" (UniqueName: \"kubernetes.io/projected/3c49bdbb-0c05-4dea-8de8-61ca09b7e84c-kube-api-access-9dsjx\") pod \"memcached-0\" (UID: \"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c\") " pod="openstack/memcached-0" Mar 21 04:42:15 crc kubenswrapper[4839]: I0321 04:42:15.074314 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.152323 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.153724 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.155824 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-cck8h" Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.175950 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.219090 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr9w9\" (UniqueName: \"kubernetes.io/projected/76b8f1b8-aa66-4f5e-937a-f837a2da28f1-kube-api-access-fr9w9\") pod \"kube-state-metrics-0\" (UID: \"76b8f1b8-aa66-4f5e-937a-f837a2da28f1\") " pod="openstack/kube-state-metrics-0" Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.325649 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr9w9\" (UniqueName: 
\"kubernetes.io/projected/76b8f1b8-aa66-4f5e-937a-f837a2da28f1-kube-api-access-fr9w9\") pod \"kube-state-metrics-0\" (UID: \"76b8f1b8-aa66-4f5e-937a-f837a2da28f1\") " pod="openstack/kube-state-metrics-0" Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.348816 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr9w9\" (UniqueName: \"kubernetes.io/projected/76b8f1b8-aa66-4f5e-937a-f837a2da28f1-kube-api-access-fr9w9\") pod \"kube-state-metrics-0\" (UID: \"76b8f1b8-aa66-4f5e-937a-f837a2da28f1\") " pod="openstack/kube-state-metrics-0" Mar 21 04:42:17 crc kubenswrapper[4839]: I0321 04:42:17.473084 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.425252 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qt5s4"] Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.426499 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.429029 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.430439 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-62scq" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.431537 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.437420 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qt5s4"] Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.478740 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-run\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.478814 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-combined-ca-bundle\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.478880 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-log-ovn\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.478907 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-run-ovn\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.478928 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-scripts\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.478969 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-ovn-controller-tls-certs\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.479030 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvg5s\" (UniqueName: \"kubernetes.io/projected/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-kube-api-access-kvg5s\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.485233 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-hrww8"] Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.486753 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.513115 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hrww8"] Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580007 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvg5s\" (UniqueName: \"kubernetes.io/projected/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-kube-api-access-kvg5s\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580058 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-log\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580087 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-run\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580110 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-run\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580140 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-combined-ca-bundle\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580163 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-etc-ovs\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580196 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ll56\" (UniqueName: \"kubernetes.io/projected/3d74e911-e100-4e79-89be-202e06bb4d30-kube-api-access-9ll56\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580215 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-log-ovn\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580233 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-run-ovn\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580248 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-scripts\") pod 
\"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580277 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-ovn-controller-tls-certs\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580294 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-lib\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.580311 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d74e911-e100-4e79-89be-202e06bb4d30-scripts\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.581264 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-log-ovn\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.581345 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-run\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 
crc kubenswrapper[4839]: I0321 04:42:20.581453 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-var-run-ovn\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.582685 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-scripts\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.593692 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-ovn-controller-tls-certs\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.594538 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-combined-ca-bundle\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.601711 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvg5s\" (UniqueName: \"kubernetes.io/projected/b31b64cb-0266-4b8a-9fcb-ae5e36c8309a-kube-api-access-kvg5s\") pod \"ovn-controller-qt5s4\" (UID: \"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a\") " pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.682193 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-lib\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.682253 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d74e911-e100-4e79-89be-202e06bb4d30-scripts\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.682330 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-log\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.682370 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-run\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.682414 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-etc-ovs\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.682493 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ll56\" (UniqueName: \"kubernetes.io/projected/3d74e911-e100-4e79-89be-202e06bb4d30-kube-api-access-9ll56\") pod \"ovn-controller-ovs-hrww8\" (UID: 
\"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.683653 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-lib\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.683733 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-run\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.683815 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-var-log\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.683923 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3d74e911-e100-4e79-89be-202e06bb4d30-etc-ovs\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.685472 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d74e911-e100-4e79-89be-202e06bb4d30-scripts\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.701006 4839 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9ll56\" (UniqueName: \"kubernetes.io/projected/3d74e911-e100-4e79-89be-202e06bb4d30-kube-api-access-9ll56\") pod \"ovn-controller-ovs-hrww8\" (UID: \"3d74e911-e100-4e79-89be-202e06bb4d30\") " pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.758048 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:20 crc kubenswrapper[4839]: I0321 04:42:20.805808 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:21 crc kubenswrapper[4839]: E0321 04:42:21.930113 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 21 04:42:21 crc kubenswrapper[4839]: E0321 04:42:21.930584 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vkmxw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-zn278_openstack(ef534e12-a75b-4e9b-b2a3-4046bece5903): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:42:21 crc kubenswrapper[4839]: E0321 04:42:21.935630 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-zn278" podUID="ef534e12-a75b-4e9b-b2a3-4046bece5903" Mar 21 04:42:21 crc kubenswrapper[4839]: E0321 04:42:21.953694 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 21 04:42:21 crc kubenswrapper[4839]: E0321 04:42:21.954006 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grztj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-7tj8n_openstack(2f135620-f512-4bfe-9875-4d4e07c6a0f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:42:21 crc kubenswrapper[4839]: E0321 04:42:21.955858 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n" podUID="2f135620-f512-4bfe-9875-4d4e07c6a0f5" Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.363519 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zn278" Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.524083 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkmxw\" (UniqueName: \"kubernetes.io/projected/ef534e12-a75b-4e9b-b2a3-4046bece5903-kube-api-access-vkmxw\") pod \"ef534e12-a75b-4e9b-b2a3-4046bece5903\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") " Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.524330 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef534e12-a75b-4e9b-b2a3-4046bece5903-config\") pod \"ef534e12-a75b-4e9b-b2a3-4046bece5903\" (UID: \"ef534e12-a75b-4e9b-b2a3-4046bece5903\") " Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.524806 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef534e12-a75b-4e9b-b2a3-4046bece5903-config" (OuterVolumeSpecName: "config") pod "ef534e12-a75b-4e9b-b2a3-4046bece5903" (UID: "ef534e12-a75b-4e9b-b2a3-4046bece5903"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.546088 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef534e12-a75b-4e9b-b2a3-4046bece5903-kube-api-access-vkmxw" (OuterVolumeSpecName: "kube-api-access-vkmxw") pod "ef534e12-a75b-4e9b-b2a3-4046bece5903" (UID: "ef534e12-a75b-4e9b-b2a3-4046bece5903"). InnerVolumeSpecName "kube-api-access-vkmxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.626673 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkmxw\" (UniqueName: \"kubernetes.io/projected/ef534e12-a75b-4e9b-b2a3-4046bece5903-kube-api-access-vkmxw\") on node \"crc\" DevicePath \"\"" Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.626708 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef534e12-a75b-4e9b-b2a3-4046bece5903-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.649868 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hpzj8"] Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.658479 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.676376 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.688748 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.698177 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.704442 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 04:42:22 crc kubenswrapper[4839]: W0321 04:42:22.720446 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e1d0e8c_00aa_4770_9e58_b8f706d80a35.slice/crio-13cf1811708e735c8587e5f387524078eddb6176802aa11ecbd1435c38ed0541 WatchSource:0}: Error finding container 13cf1811708e735c8587e5f387524078eddb6176802aa11ecbd1435c38ed0541: Status 404 returned error can't find the container with id 
13cf1811708e735c8587e5f387524078eddb6176802aa11ecbd1435c38ed0541 Mar 21 04:42:22 crc kubenswrapper[4839]: W0321 04:42:22.730852 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8028561c_b039_4400_a065_b5efee753b5f.slice/crio-5eaf787d4b2014f872ad6aefa43fc8d3d3baab1a1f0af69a0017de992e3a8b54 WatchSource:0}: Error finding container 5eaf787d4b2014f872ad6aefa43fc8d3d3baab1a1f0af69a0017de992e3a8b54: Status 404 returned error can't find the container with id 5eaf787d4b2014f872ad6aefa43fc8d3d3baab1a1f0af69a0017de992e3a8b54 Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.865720 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.964600 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30d22e92-45bd-4d1e-954e-3ade801245d4","Type":"ContainerStarted","Data":"9b8ab073370c7e9b4ef0d33f05bad1959de23b13a08aa77b184763533f460bc8"} Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.966321 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zn278" Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.966363 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-zn278" event={"ID":"ef534e12-a75b-4e9b-b2a3-4046bece5903","Type":"ContainerDied","Data":"8d9c14a410c4252b3bc652d39bc6ba19cb56d790ccb813a314300ee40264a16c"} Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.977799 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" event={"ID":"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5","Type":"ContainerStarted","Data":"0ce222ce6f31634231e7d8b6628241b56c0a1c3b1c7afeb55d6d347fe6bd4c49"} Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.979295 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c","Type":"ContainerStarted","Data":"73cd9e67027e3c3a25c526ec7264951bd83d32e50e75829a33891f7f554d78d2"} Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.987116 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qt5s4"] Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.988328 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4f1edf0d-f220-4815-aeb6-e4507576247a","Type":"ContainerStarted","Data":"92d8cbd04f02626e8199703ada8650eafd375a40b634369189fe2eae5e79b310"} Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.991243 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e1d0e8c-00aa-4770-9e58-b8f706d80a35","Type":"ContainerStarted","Data":"13cf1811708e735c8587e5f387524078eddb6176802aa11ecbd1435c38ed0541"} Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.993432 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"8028561c-b039-4400-a065-b5efee753b5f","Type":"ContainerStarted","Data":"5eaf787d4b2014f872ad6aefa43fc8d3d3baab1a1f0af69a0017de992e3a8b54"} Mar 21 04:42:22 crc kubenswrapper[4839]: I0321 04:42:22.996323 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"76b8f1b8-aa66-4f5e-937a-f837a2da28f1","Type":"ContainerStarted","Data":"c3e02332eed0f6ac50479a637c2f9551186161a99dab978e61007f6da0cf9aba"} Mar 21 04:42:23 crc kubenswrapper[4839]: W0321 04:42:23.025325 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45f7903c_4081_446b_94d9_ad979332590b.slice/crio-104ea53b53097b885fd70bb32ecb2748f3dd610f76b7554afac7980d69d50aa2 WatchSource:0}: Error finding container 104ea53b53097b885fd70bb32ecb2748f3dd610f76b7554afac7980d69d50aa2: Status 404 returned error can't find the container with id 104ea53b53097b885fd70bb32ecb2748f3dd610f76b7554afac7980d69d50aa2 Mar 21 04:42:23 crc kubenswrapper[4839]: W0321 04:42:23.027475 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d74e911_e100_4e79_89be_202e06bb4d30.slice/crio-7717e327dcaa2d021e550fe3f1735a11ebeed797ea33dd4e818cb7806408989f WatchSource:0}: Error finding container 7717e327dcaa2d021e550fe3f1735a11ebeed797ea33dd4e818cb7806408989f: Status 404 returned error can't find the container with id 7717e327dcaa2d021e550fe3f1735a11ebeed797ea33dd4e818cb7806408989f Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.030621 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hrww8"] Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.042791 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-d99qf"] Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.117615 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-675f4bcbfc-zn278"] Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.130017 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zn278"] Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.279676 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-mx5tf"] Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.283491 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.289160 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.289445 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.289704 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mx5tf"] Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.328426 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.450288 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-config\") pod \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.450366 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grztj\" (UniqueName: \"kubernetes.io/projected/2f135620-f512-4bfe-9875-4d4e07c6a0f5-kube-api-access-grztj\") pod \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.450524 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-dns-svc\") pod \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\" (UID: \"2f135620-f512-4bfe-9875-4d4e07c6a0f5\") " Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.450812 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/64d13111-845e-4c61-a4ce-483ddfb799b7-ovn-rundir\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.450856 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrpd6\" (UniqueName: \"kubernetes.io/projected/64d13111-845e-4c61-a4ce-483ddfb799b7-kube-api-access-mrpd6\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.450886 
4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/64d13111-845e-4c61-a4ce-483ddfb799b7-ovs-rundir\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.451004 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64d13111-845e-4c61-a4ce-483ddfb799b7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.451046 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d13111-845e-4c61-a4ce-483ddfb799b7-combined-ca-bundle\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.451066 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-config" (OuterVolumeSpecName: "config") pod "2f135620-f512-4bfe-9875-4d4e07c6a0f5" (UID: "2f135620-f512-4bfe-9875-4d4e07c6a0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.451110 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64d13111-845e-4c61-a4ce-483ddfb799b7-config\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.451334 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f135620-f512-4bfe-9875-4d4e07c6a0f5" (UID: "2f135620-f512-4bfe-9875-4d4e07c6a0f5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.455347 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f135620-f512-4bfe-9875-4d4e07c6a0f5-kube-api-access-grztj" (OuterVolumeSpecName: "kube-api-access-grztj") pod "2f135620-f512-4bfe-9875-4d4e07c6a0f5" (UID: "2f135620-f512-4bfe-9875-4d4e07c6a0f5"). InnerVolumeSpecName "kube-api-access-grztj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952076 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/64d13111-845e-4c61-a4ce-483ddfb799b7-ovn-rundir\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952171 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrpd6\" (UniqueName: \"kubernetes.io/projected/64d13111-845e-4c61-a4ce-483ddfb799b7-kube-api-access-mrpd6\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952234 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/64d13111-845e-4c61-a4ce-483ddfb799b7-ovs-rundir\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952330 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64d13111-845e-4c61-a4ce-483ddfb799b7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952407 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d13111-845e-4c61-a4ce-483ddfb799b7-combined-ca-bundle\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " 
pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952480 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64d13111-845e-4c61-a4ce-483ddfb799b7-config\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952598 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952615 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grztj\" (UniqueName: \"kubernetes.io/projected/2f135620-f512-4bfe-9875-4d4e07c6a0f5-kube-api-access-grztj\") on node \"crc\" DevicePath \"\"" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.952631 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f135620-f512-4bfe-9875-4d4e07c6a0f5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.955665 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.956788 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.958385 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/64d13111-845e-4c61-a4ce-483ddfb799b7-ovs-rundir\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.963101 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.963942 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-f7k5z" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.964583 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.965230 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.965718 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64d13111-845e-4c61-a4ce-483ddfb799b7-config\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.965901 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/64d13111-845e-4c61-a4ce-483ddfb799b7-ovn-rundir\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.967687 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-sb-0"] Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.968644 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d13111-845e-4c61-a4ce-483ddfb799b7-combined-ca-bundle\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.969147 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64d13111-845e-4c61-a4ce-483ddfb799b7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:23 crc kubenswrapper[4839]: I0321 04:42:23.981720 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrpd6\" (UniqueName: \"kubernetes.io/projected/64d13111-845e-4c61-a4ce-483ddfb799b7-kube-api-access-mrpd6\") pod \"ovn-controller-metrics-mx5tf\" (UID: \"64d13111-845e-4c61-a4ce-483ddfb799b7\") " pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.004507 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" event={"ID":"45f7903c-4081-446b-94d9-ad979332590b","Type":"ContainerStarted","Data":"104ea53b53097b885fd70bb32ecb2748f3dd610f76b7554afac7980d69d50aa2"} Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.008634 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4" event={"ID":"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a","Type":"ContainerStarted","Data":"3f05c669d23667bcf282f85eaf3b128b498ba04ca003dea9bc418e7987ec280b"} Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.009970 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n" 
event={"ID":"2f135620-f512-4bfe-9875-4d4e07c6a0f5","Type":"ContainerDied","Data":"2be9891bd895defb2ca0d28836ebf421c8763119934d2eefdf49a40b80778096"} Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.010058 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.012250 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hrww8" event={"ID":"3d74e911-e100-4e79-89be-202e06bb4d30","Type":"ContainerStarted","Data":"7717e327dcaa2d021e550fe3f1735a11ebeed797ea33dd4e818cb7806408989f"} Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.109809 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.111026 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.118205 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-65bwr" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.118486 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.118551 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.118775 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.122792 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.157910 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-config\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.158836 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.158891 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.158947 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.158975 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.159004 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.159036 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.159104 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xwmw\" (UniqueName: \"kubernetes.io/projected/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-kube-api-access-5xwmw\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.212684 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-mx5tf" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.265692 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a7a1028-3deb-4033-890c-db0861c6a9a2-config\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.265756 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a7a1028-3deb-4033-890c-db0861c6a9a2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.270829 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.270860 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.270885 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.270916 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.270938 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.270991 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xwmw\" (UniqueName: \"kubernetes.io/projected/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-kube-api-access-5xwmw\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271011 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271040 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-config\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271064 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271106 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271131 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271169 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a7a1028-3deb-4033-890c-db0861c6a9a2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271191 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9ccz\" (UniqueName: \"kubernetes.io/projected/4a7a1028-3deb-4033-890c-db0861c6a9a2-kube-api-access-s9ccz\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.271211 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.272033 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.274351 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-config\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.274355 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.275847 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.276865 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.285456 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.287693 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xwmw\" (UniqueName: \"kubernetes.io/projected/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-kube-api-access-5xwmw\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.294069 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c2e5ef4-e4c0-4278-897e-ce5d00b4079d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.294341 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d\") " pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.425047 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.425836 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " 
pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.425978 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a7a1028-3deb-4033-890c-db0861c6a9a2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.426086 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9ccz\" (UniqueName: \"kubernetes.io/projected/4a7a1028-3deb-4033-890c-db0861c6a9a2-kube-api-access-s9ccz\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.426204 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a7a1028-3deb-4033-890c-db0861c6a9a2-config\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.426361 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a7a1028-3deb-4033-890c-db0861c6a9a2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.426511 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.426634 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.426645 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a7a1028-3deb-4033-890c-db0861c6a9a2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.426287 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.429887 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.430096 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a7a1028-3deb-4033-890c-db0861c6a9a2-config\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.430533 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.431762 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a7a1028-3deb-4033-890c-db0861c6a9a2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.433585 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a7a1028-3deb-4033-890c-db0861c6a9a2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.445603 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9ccz\" (UniqueName: \"kubernetes.io/projected/4a7a1028-3deb-4033-890c-db0861c6a9a2-kube-api-access-s9ccz\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.452661 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4a7a1028-3deb-4033-890c-db0861c6a9a2\") " pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.462456 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef534e12-a75b-4e9b-b2a3-4046bece5903" path="/var/lib/kubelet/pods/ef534e12-a75b-4e9b-b2a3-4046bece5903/volumes" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.524501 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:24 crc kubenswrapper[4839]: I0321 04:42:24.532865 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:26 crc kubenswrapper[4839]: I0321 04:42:26.728322 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 21 04:42:26 crc kubenswrapper[4839]: I0321 04:42:26.763210 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mx5tf"] Mar 21 04:42:27 crc kubenswrapper[4839]: I0321 04:42:27.637415 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 21 04:42:29 crc kubenswrapper[4839]: W0321 04:42:29.105210 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c2e5ef4_e4c0_4278_897e_ce5d00b4079d.slice/crio-d9a27b85c9cd913ab75f08dfedcf1fddb423f9ed676b8a4adcd4d45cc577f713 WatchSource:0}: Error finding container d9a27b85c9cd913ab75f08dfedcf1fddb423f9ed676b8a4adcd4d45cc577f713: Status 404 returned error can't find the container with id d9a27b85c9cd913ab75f08dfedcf1fddb423f9ed676b8a4adcd4d45cc577f713 Mar 21 04:42:29 crc kubenswrapper[4839]: W0321 04:42:29.106433 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a7a1028_3deb_4033_890c_db0861c6a9a2.slice/crio-a8bf4c720e0e6b3f3fd84af1321aca246839afe8f37873fe14892ee36dcca921 WatchSource:0}: Error finding container a8bf4c720e0e6b3f3fd84af1321aca246839afe8f37873fe14892ee36dcca921: Status 404 returned error can't find the container with id a8bf4c720e0e6b3f3fd84af1321aca246839afe8f37873fe14892ee36dcca921 Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.057178 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d","Type":"ContainerStarted","Data":"d9a27b85c9cd913ab75f08dfedcf1fddb423f9ed676b8a4adcd4d45cc577f713"} Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.058961 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4a7a1028-3deb-4033-890c-db0861c6a9a2","Type":"ContainerStarted","Data":"a8bf4c720e0e6b3f3fd84af1321aca246839afe8f37873fe14892ee36dcca921"} Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.060287 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mx5tf" event={"ID":"64d13111-845e-4c61-a4ce-483ddfb799b7","Type":"ContainerStarted","Data":"af359b866288950544e18ecda8a727bc0e66ac01a39c017d435cd3186e9fc456"} Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.980616 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.980673 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.980719 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.981387 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ca17db50991abbb7e584e1a028ac5195afd6abd747f7e5e9969a64ed39bcf6c"} 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:42:30 crc kubenswrapper[4839]: I0321 04:42:30.981439 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://3ca17db50991abbb7e584e1a028ac5195afd6abd747f7e5e9969a64ed39bcf6c" gracePeriod=600 Mar 21 04:42:31 crc kubenswrapper[4839]: I0321 04:42:31.675885 4839 scope.go:117] "RemoveContainer" containerID="edf0b9b310ad11f4cb21b959eb633d808203a45ec2b8463a2fe875186e107484" Mar 21 04:42:32 crc kubenswrapper[4839]: I0321 04:42:32.081815 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="3ca17db50991abbb7e584e1a028ac5195afd6abd747f7e5e9969a64ed39bcf6c" exitCode=0 Mar 21 04:42:32 crc kubenswrapper[4839]: I0321 04:42:32.081854 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"3ca17db50991abbb7e584e1a028ac5195afd6abd747f7e5e9969a64ed39bcf6c"} Mar 21 04:42:32 crc kubenswrapper[4839]: I0321 04:42:32.081921 4839 scope.go:117] "RemoveContainer" containerID="d9f640234dbdc5d617b2a0974e24e968076d94c55d65466d46d7d064392afc02" Mar 21 04:42:42 crc kubenswrapper[4839]: I0321 04:42:42.158974 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" event={"ID":"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5","Type":"ContainerStarted","Data":"f34f4acac62bc779d36b74541949c853bc07653f2853d959f243208237be8360"} Mar 21 04:42:42 crc kubenswrapper[4839]: I0321 04:42:42.162348 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"48bb6d2443587cf3023178aa72ea424c113f55b1e7600821dbf21c214de8e70f"} Mar 21 04:42:43 crc kubenswrapper[4839]: I0321 04:42:43.172553 4839 generic.go:334] "Generic (PLEG): container finished" podID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerID="f34f4acac62bc779d36b74541949c853bc07653f2853d959f243208237be8360" exitCode=0 Mar 21 04:42:43 crc kubenswrapper[4839]: I0321 04:42:43.172652 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" event={"ID":"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5","Type":"ContainerDied","Data":"f34f4acac62bc779d36b74541949c853bc07653f2853d959f243208237be8360"} Mar 21 04:42:43 crc kubenswrapper[4839]: E0321 04:42:43.704107 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 21 04:42:43 crc kubenswrapper[4839]: E0321 04:42:43.704503 4839 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 21 04:42:43 crc kubenswrapper[4839]: E0321 04:42:43.704671 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods 
--namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fr9w9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(76b8f1b8-aa66-4f5e-937a-f837a2da28f1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" logger="UnhandledError" Mar 21 04:42:43 crc kubenswrapper[4839]: E0321 04:42:43.705868 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" Mar 21 04:42:44 crc kubenswrapper[4839]: E0321 04:42:44.184941 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.195628 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" event={"ID":"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5","Type":"ContainerStarted","Data":"3bcf6c4dca6994d55b402560e153aeec77a770a87db6c2a3981fd71f1c0e7782"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.196235 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.197786 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4" event={"ID":"b31b64cb-0266-4b8a-9fcb-ae5e36c8309a","Type":"ContainerStarted","Data":"03174b199080cd580b420f87aeff54e80e4f62879ff8261601e9b256242195a2"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.197926 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-qt5s4" Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.199807 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"3c49bdbb-0c05-4dea-8de8-61ca09b7e84c","Type":"ContainerStarted","Data":"0b8b7584ab62d33c68dbb641dd3ecf4023cf10a177e28f93543a064496569031"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.199922 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.201859 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d","Type":"ContainerStarted","Data":"f47aaf50b193096eb277420228011f3f9ef749e81e13ef2453a74549a72a6b4a"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.203592 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4f1edf0d-f220-4815-aeb6-e4507576247a","Type":"ContainerStarted","Data":"eb929a7642759119016130bc2484941db41a93bf8dd7c08b25451605be57abdb"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.207151 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30d22e92-45bd-4d1e-954e-3ade801245d4","Type":"ContainerStarted","Data":"03af0871636dda05a3e9679ab089635f4fe936a9737c3d6ca310aaab50787692"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.209174 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4a7a1028-3deb-4033-890c-db0861c6a9a2","Type":"ContainerStarted","Data":"435e1ac136be0d7f0e75d3d6e51aaa96829af337f68d82ee84841b76ec92423a"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.210820 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e1d0e8c-00aa-4770-9e58-b8f706d80a35","Type":"ContainerStarted","Data":"e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.212620 4839 generic.go:334] "Generic (PLEG): container finished" 
podID="3d74e911-e100-4e79-89be-202e06bb4d30" containerID="dbd877256a7c5b5bd86e79c64adcea4e8fc78bd1bc828444b55643dd18222c51" exitCode=0 Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.212701 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hrww8" event={"ID":"3d74e911-e100-4e79-89be-202e06bb4d30","Type":"ContainerDied","Data":"dbd877256a7c5b5bd86e79c64adcea4e8fc78bd1bc828444b55643dd18222c51"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.217760 4839 generic.go:334] "Generic (PLEG): container finished" podID="45f7903c-4081-446b-94d9-ad979332590b" containerID="ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b" exitCode=0 Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.217823 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" event={"ID":"45f7903c-4081-446b-94d9-ad979332590b","Type":"ContainerDied","Data":"ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b"} Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.238734 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" podStartSLOduration=31.900222346 podStartE2EDuration="35.238713681s" podCreationTimestamp="2026-03-21 04:42:10 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.678487073 +0000 UTC m=+1147.006273749" lastFinishedPulling="2026-03-21 04:42:26.016978408 +0000 UTC m=+1150.344765084" observedRunningTime="2026-03-21 04:42:45.216897041 +0000 UTC m=+1169.544683727" watchObservedRunningTime="2026-03-21 04:42:45.238713681 +0000 UTC m=+1169.566500357" Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.242150 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.515870427 podStartE2EDuration="31.242133567s" podCreationTimestamp="2026-03-21 04:42:14 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.700201561 +0000 UTC 
m=+1147.027988237" lastFinishedPulling="2026-03-21 04:42:35.426464691 +0000 UTC m=+1159.754251377" observedRunningTime="2026-03-21 04:42:45.232901299 +0000 UTC m=+1169.560687995" watchObservedRunningTime="2026-03-21 04:42:45.242133567 +0000 UTC m=+1169.569920243" Mar 21 04:42:45 crc kubenswrapper[4839]: I0321 04:42:45.282446 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-qt5s4" podStartSLOduration=9.766836292 podStartE2EDuration="25.282426494s" podCreationTimestamp="2026-03-21 04:42:20 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.987765975 +0000 UTC m=+1147.315552651" lastFinishedPulling="2026-03-21 04:42:38.503356177 +0000 UTC m=+1162.831142853" observedRunningTime="2026-03-21 04:42:45.280216182 +0000 UTC m=+1169.608002878" watchObservedRunningTime="2026-03-21 04:42:45.282426494 +0000 UTC m=+1169.610213180" Mar 21 04:42:46 crc kubenswrapper[4839]: I0321 04:42:46.258662 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8028561c-b039-4400-a065-b5efee753b5f","Type":"ContainerStarted","Data":"fcd7e300ab111a88b888a2fc68f007c49d0404de0648aa1177c5d04bb341e74c"} Mar 21 04:42:47 crc kubenswrapper[4839]: I0321 04:42:47.284725 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mx5tf" event={"ID":"64d13111-845e-4c61-a4ce-483ddfb799b7","Type":"ContainerStarted","Data":"175bbeed186af8e0c1144e458ee1210b3823c52893d44d05376e0576ec7042d3"} Mar 21 04:42:47 crc kubenswrapper[4839]: I0321 04:42:47.288938 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hrww8" event={"ID":"3d74e911-e100-4e79-89be-202e06bb4d30","Type":"ContainerStarted","Data":"dc90889004e71da508eff82ba8b3e962b8dde614cbf9d560b36062af63991346"} Mar 21 04:42:47 crc kubenswrapper[4839]: I0321 04:42:47.292026 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" 
event={"ID":"45f7903c-4081-446b-94d9-ad979332590b","Type":"ContainerStarted","Data":"ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5"} Mar 21 04:42:47 crc kubenswrapper[4839]: I0321 04:42:47.292241 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" Mar 21 04:42:47 crc kubenswrapper[4839]: I0321 04:42:47.314414 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" podStartSLOduration=24.808359938 podStartE2EDuration="37.314393327s" podCreationTimestamp="2026-03-21 04:42:10 +0000 UTC" firstStartedPulling="2026-03-21 04:42:23.029159383 +0000 UTC m=+1147.356946059" lastFinishedPulling="2026-03-21 04:42:35.535192772 +0000 UTC m=+1159.862979448" observedRunningTime="2026-03-21 04:42:47.311040424 +0000 UTC m=+1171.638827130" watchObservedRunningTime="2026-03-21 04:42:47.314393327 +0000 UTC m=+1171.642180013" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.305560 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hrww8" event={"ID":"3d74e911-e100-4e79-89be-202e06bb4d30","Type":"ContainerStarted","Data":"c545a523b095359d2d8c61304f37592d2cebcf5b76db7972a0012ae265bb8e7e"} Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.306312 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.308060 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4a7a1028-3deb-4033-890c-db0861c6a9a2","Type":"ContainerStarted","Data":"8cef22d68ec78af1eb35427e9159a065e50b724bcca1f49f2f330f9757887e18"} Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.345212 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-hrww8" podStartSLOduration=15.280689572 podStartE2EDuration="28.345186044s" 
podCreationTimestamp="2026-03-21 04:42:20 +0000 UTC" firstStartedPulling="2026-03-21 04:42:23.029895014 +0000 UTC m=+1147.357681690" lastFinishedPulling="2026-03-21 04:42:36.094391486 +0000 UTC m=+1160.422178162" observedRunningTime="2026-03-21 04:42:48.328032744 +0000 UTC m=+1172.655819420" watchObservedRunningTime="2026-03-21 04:42:48.345186044 +0000 UTC m=+1172.672972730" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.354600 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.285219283 podStartE2EDuration="25.354548216s" podCreationTimestamp="2026-03-21 04:42:23 +0000 UTC" firstStartedPulling="2026-03-21 04:42:29.630032054 +0000 UTC m=+1153.957818770" lastFinishedPulling="2026-03-21 04:42:47.699361017 +0000 UTC m=+1172.027147703" observedRunningTime="2026-03-21 04:42:48.350829852 +0000 UTC m=+1172.678616538" watchObservedRunningTime="2026-03-21 04:42:48.354548216 +0000 UTC m=+1172.682334912" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.533367 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.582268 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.604675 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-mx5tf" podStartSLOduration=9.028040949 podStartE2EDuration="25.604642972s" podCreationTimestamp="2026-03-21 04:42:23 +0000 UTC" firstStartedPulling="2026-03-21 04:42:29.629741626 +0000 UTC m=+1153.957528312" lastFinishedPulling="2026-03-21 04:42:46.206343659 +0000 UTC m=+1170.534130335" observedRunningTime="2026-03-21 04:42:48.37866829 +0000 UTC m=+1172.706454956" watchObservedRunningTime="2026-03-21 04:42:48.604642972 +0000 UTC m=+1172.932429648" Mar 21 04:42:48 crc 
kubenswrapper[4839]: I0321 04:42:48.785868 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hpzj8"] Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.786553 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" podUID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerName="dnsmasq-dns" containerID="cri-o://3bcf6c4dca6994d55b402560e153aeec77a770a87db6c2a3981fd71f1c0e7782" gracePeriod=10 Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.826170 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-vxcbj"] Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.827426 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.830241 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.845701 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-vxcbj"] Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.944709 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dqqx\" (UniqueName: \"kubernetes.io/projected/40fc2c3c-2c23-497a-89d6-906ba78506c2-kube-api-access-9dqqx\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.944833 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 
04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.944891 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-config\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:48 crc kubenswrapper[4839]: I0321 04:42:48.944991 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.000911 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-d99qf"] Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.047199 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dqqx\" (UniqueName: \"kubernetes.io/projected/40fc2c3c-2c23-497a-89d6-906ba78506c2-kube-api-access-9dqqx\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.047256 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.047319 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-config\") pod 
\"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.047387 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.048344 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.048500 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.048636 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-config\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.082711 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dqqx\" (UniqueName: \"kubernetes.io/projected/40fc2c3c-2c23-497a-89d6-906ba78506c2-kube-api-access-9dqqx\") pod \"dnsmasq-dns-6bc7876d45-vxcbj\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " 
pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.084834 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-4z7nl"] Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.086278 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.093056 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.094917 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4z7nl"] Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.148446 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-config\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.148512 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-dns-svc\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.148540 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j56r2\" (UniqueName: \"kubernetes.io/projected/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-kube-api-access-j56r2\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.148597 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.148638 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.168881 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.258170 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-config\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.258248 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-dns-svc\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.258273 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j56r2\" (UniqueName: \"kubernetes.io/projected/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-kube-api-access-j56r2\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: 
\"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.258330 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.258383 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.276394 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.284488 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-config\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.290527 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-dns-svc\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc 
kubenswrapper[4839]: I0321 04:42:49.292111 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j56r2\" (UniqueName: \"kubernetes.io/projected/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-kube-api-access-j56r2\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.295714 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-4z7nl\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.323092 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8c2e5ef4-e4c0-4278-897e-ce5d00b4079d","Type":"ContainerStarted","Data":"dfd3e2ddc3b5f48fb0549e59e6def9e56a5ded8e1205e64ee9554fc4e94a80b9"} Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.326205 4839 generic.go:334] "Generic (PLEG): container finished" podID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerID="3bcf6c4dca6994d55b402560e153aeec77a770a87db6c2a3981fd71f1c0e7782" exitCode=0 Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.326293 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" event={"ID":"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5","Type":"ContainerDied","Data":"3bcf6c4dca6994d55b402560e153aeec77a770a87db6c2a3981fd71f1c0e7782"} Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.326942 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.326973 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:42:49 crc 
kubenswrapper[4839]: I0321 04:42:49.327076 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" podUID="45f7903c-4081-446b-94d9-ad979332590b" containerName="dnsmasq-dns" containerID="cri-o://ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5" gracePeriod=10 Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.395058 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.421753 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.726753314 podStartE2EDuration="27.421730099s" podCreationTimestamp="2026-03-21 04:42:22 +0000 UTC" firstStartedPulling="2026-03-21 04:42:29.629630573 +0000 UTC m=+1153.957417249" lastFinishedPulling="2026-03-21 04:42:48.324607338 +0000 UTC m=+1172.652394034" observedRunningTime="2026-03-21 04:42:49.358300105 +0000 UTC m=+1173.686086801" watchObservedRunningTime="2026-03-21 04:42:49.421730099 +0000 UTC m=+1173.749516775" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.482740 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.516731 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.525147 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.667272 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-config\") pod \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.667704 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cxnx\" (UniqueName: \"kubernetes.io/projected/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-kube-api-access-6cxnx\") pod \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.667771 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-dns-svc\") pod \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\" (UID: \"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5\") " Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.673488 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-kube-api-access-6cxnx" (OuterVolumeSpecName: "kube-api-access-6cxnx") pod "5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" (UID: "5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5"). InnerVolumeSpecName "kube-api-access-6cxnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.712645 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" (UID: "5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.738084 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-config" (OuterVolumeSpecName: "config") pod "5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" (UID: "5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.752864 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-vxcbj"] Mar 21 04:42:49 crc kubenswrapper[4839]: W0321 04:42:49.756410 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40fc2c3c_2c23_497a_89d6_906ba78506c2.slice/crio-19aecc8d605d42905ae54d5640d84373129c9b80e1392d6792f10bef52bd1252 WatchSource:0}: Error finding container 19aecc8d605d42905ae54d5640d84373129c9b80e1392d6792f10bef52bd1252: Status 404 returned error can't find the container with id 19aecc8d605d42905ae54d5640d84373129c9b80e1392d6792f10bef52bd1252 Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.767915 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.769955 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.769989 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cxnx\" (UniqueName: \"kubernetes.io/projected/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-kube-api-access-6cxnx\") on node \"crc\" DevicePath \"\"" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.770007 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.871941 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-dns-svc\") pod \"45f7903c-4081-446b-94d9-ad979332590b\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.872442 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-config\") pod \"45f7903c-4081-446b-94d9-ad979332590b\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.872528 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djkvq\" (UniqueName: \"kubernetes.io/projected/45f7903c-4081-446b-94d9-ad979332590b-kube-api-access-djkvq\") pod \"45f7903c-4081-446b-94d9-ad979332590b\" (UID: \"45f7903c-4081-446b-94d9-ad979332590b\") " Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.876699 4839 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f7903c-4081-446b-94d9-ad979332590b-kube-api-access-djkvq" (OuterVolumeSpecName: "kube-api-access-djkvq") pod "45f7903c-4081-446b-94d9-ad979332590b" (UID: "45f7903c-4081-446b-94d9-ad979332590b"). InnerVolumeSpecName "kube-api-access-djkvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.911479 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45f7903c-4081-446b-94d9-ad979332590b" (UID: "45f7903c-4081-446b-94d9-ad979332590b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.919679 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-config" (OuterVolumeSpecName: "config") pod "45f7903c-4081-446b-94d9-ad979332590b" (UID: "45f7903c-4081-446b-94d9-ad979332590b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.974657 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djkvq\" (UniqueName: \"kubernetes.io/projected/45f7903c-4081-446b-94d9-ad979332590b-kube-api-access-djkvq\") on node \"crc\" DevicePath \"\"" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.974711 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:42:49 crc kubenswrapper[4839]: I0321 04:42:49.974723 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45f7903c-4081-446b-94d9-ad979332590b-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.012740 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4z7nl"] Mar 21 04:42:50 crc kubenswrapper[4839]: W0321 04:42:50.015905 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd92c95bc_4fde_4d24_ad6e_d4583ec19b3a.slice/crio-8e39c8706b3815a73340c1ae1ac875a1799d9491908ef6b336359be187e60c9d WatchSource:0}: Error finding container 8e39c8706b3815a73340c1ae1ac875a1799d9491908ef6b336359be187e60c9d: Status 404 returned error can't find the container with id 8e39c8706b3815a73340c1ae1ac875a1799d9491908ef6b336359be187e60c9d Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.075316 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.339211 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4z7nl" 
event={"ID":"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a","Type":"ContainerStarted","Data":"8e39c8706b3815a73340c1ae1ac875a1799d9491908ef6b336359be187e60c9d"} Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.341039 4839 generic.go:334] "Generic (PLEG): container finished" podID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerID="8a45ce0a9f6faa9e05d38583bc564141b514dd0db35ef0ce33729091517ea300" exitCode=0 Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.341104 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" event={"ID":"40fc2c3c-2c23-497a-89d6-906ba78506c2","Type":"ContainerDied","Data":"8a45ce0a9f6faa9e05d38583bc564141b514dd0db35ef0ce33729091517ea300"} Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.341125 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" event={"ID":"40fc2c3c-2c23-497a-89d6-906ba78506c2","Type":"ContainerStarted","Data":"19aecc8d605d42905ae54d5640d84373129c9b80e1392d6792f10bef52bd1252"} Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.344667 4839 generic.go:334] "Generic (PLEG): container finished" podID="45f7903c-4081-446b-94d9-ad979332590b" containerID="ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5" exitCode=0 Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.344734 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" event={"ID":"45f7903c-4081-446b-94d9-ad979332590b","Type":"ContainerDied","Data":"ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5"} Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.344740 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.344756 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-d99qf" event={"ID":"45f7903c-4081-446b-94d9-ad979332590b","Type":"ContainerDied","Data":"104ea53b53097b885fd70bb32ecb2748f3dd610f76b7554afac7980d69d50aa2"} Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.344778 4839 scope.go:117] "RemoveContainer" containerID="ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5" Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.348554 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" event={"ID":"5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5","Type":"ContainerDied","Data":"0ce222ce6f31634231e7d8b6628241b56c0a1c3b1c7afeb55d6d347fe6bd4c49"} Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.349237 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hpzj8" Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.451961 4839 scope.go:117] "RemoveContainer" containerID="ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b" Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.493528 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hpzj8"] Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.500226 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hpzj8"] Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.505619 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-d99qf"] Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.509690 4839 scope.go:117] "RemoveContainer" containerID="ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5" Mar 21 04:42:50 crc kubenswrapper[4839]: E0321 04:42:50.510168 4839 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5\": container with ID starting with ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5 not found: ID does not exist" containerID="ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5" Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.510204 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5"} err="failed to get container status \"ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5\": rpc error: code = NotFound desc = could not find container \"ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5\": container with ID starting with ceb5288368bd1daf0f9565c9514c1f663aa6c3ba57ee1a29a443b28ffe10ccc5 not found: ID does not exist" Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.510227 4839 scope.go:117] "RemoveContainer" containerID="ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b" Mar 21 04:42:50 crc kubenswrapper[4839]: E0321 04:42:50.510531 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b\": container with ID starting with ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b not found: ID does not exist" containerID="ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b" Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.510587 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b"} err="failed to get container status \"ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b\": rpc error: code = NotFound desc = could 
not find container \"ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b\": container with ID starting with ca652ab32b31e13e16160470923dab804165a2c796e14ceaeace470aa8e62f9b not found: ID does not exist" Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.510614 4839 scope.go:117] "RemoveContainer" containerID="3bcf6c4dca6994d55b402560e153aeec77a770a87db6c2a3981fd71f1c0e7782" Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.513002 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-d99qf"] Mar 21 04:42:50 crc kubenswrapper[4839]: I0321 04:42:50.540789 4839 scope.go:117] "RemoveContainer" containerID="f34f4acac62bc779d36b74541949c853bc07653f2853d959f243208237be8360" Mar 21 04:42:51 crc kubenswrapper[4839]: I0321 04:42:51.359300 4839 generic.go:334] "Generic (PLEG): container finished" podID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerID="fdec34f8addba6741ac12f007463e3379c5cafeea2de83548fa2bc44a873ebd5" exitCode=0 Mar 21 04:42:51 crc kubenswrapper[4839]: I0321 04:42:51.359397 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4z7nl" event={"ID":"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a","Type":"ContainerDied","Data":"fdec34f8addba6741ac12f007463e3379c5cafeea2de83548fa2bc44a873ebd5"} Mar 21 04:42:51 crc kubenswrapper[4839]: I0321 04:42:51.362434 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" event={"ID":"40fc2c3c-2c23-497a-89d6-906ba78506c2","Type":"ContainerStarted","Data":"5f6671d09fc3d7cf05752820e4458b198d84c62a10462305fe2e9de3a8094910"} Mar 21 04:42:51 crc kubenswrapper[4839]: I0321 04:42:51.362901 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:51 crc kubenswrapper[4839]: I0321 04:42:51.409522 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" 
podStartSLOduration=3.409503788 podStartE2EDuration="3.409503788s" podCreationTimestamp="2026-03-21 04:42:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:42:51.401994478 +0000 UTC m=+1175.729781194" watchObservedRunningTime="2026-03-21 04:42:51.409503788 +0000 UTC m=+1175.737290474" Mar 21 04:42:51 crc kubenswrapper[4839]: I0321 04:42:51.525668 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:51 crc kubenswrapper[4839]: I0321 04:42:51.592737 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.371429 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4z7nl" event={"ID":"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a","Type":"ContainerStarted","Data":"97084a1051c26c4cfcbee9a2a345f9fe3d46b532fdf62d0b9b04772413ec0e3b"} Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.372016 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.393558 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-4z7nl" podStartSLOduration=3.393537077 podStartE2EDuration="3.393537077s" podCreationTimestamp="2026-03-21 04:42:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:42:52.387823697 +0000 UTC m=+1176.715610393" watchObservedRunningTime="2026-03-21 04:42:52.393537077 +0000 UTC m=+1176.721323753" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.412583 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 21 04:42:52 crc 
kubenswrapper[4839]: I0321 04:42:52.469279 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f7903c-4081-446b-94d9-ad979332590b" path="/var/lib/kubelet/pods/45f7903c-4081-446b-94d9-ad979332590b/volumes" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.469882 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" path="/var/lib/kubelet/pods/5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5/volumes" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.550274 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 21 04:42:52 crc kubenswrapper[4839]: E0321 04:42:52.550649 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerName="dnsmasq-dns" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.550671 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerName="dnsmasq-dns" Mar 21 04:42:52 crc kubenswrapper[4839]: E0321 04:42:52.550713 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerName="init" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.550720 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerName="init" Mar 21 04:42:52 crc kubenswrapper[4839]: E0321 04:42:52.550733 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f7903c-4081-446b-94d9-ad979332590b" containerName="dnsmasq-dns" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.550740 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f7903c-4081-446b-94d9-ad979332590b" containerName="dnsmasq-dns" Mar 21 04:42:52 crc kubenswrapper[4839]: E0321 04:42:52.550757 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f7903c-4081-446b-94d9-ad979332590b" containerName="init" Mar 21 04:42:52 crc 
kubenswrapper[4839]: I0321 04:42:52.550763 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f7903c-4081-446b-94d9-ad979332590b" containerName="init" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.550918 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0d86dd-f2c4-415e-b4dc-0bfeb0b1dbb5" containerName="dnsmasq-dns" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.550934 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f7903c-4081-446b-94d9-ad979332590b" containerName="dnsmasq-dns" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.551745 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.553929 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.554406 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-c8b6d" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.557234 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.562370 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.582407 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.623454 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbcaa531-3e09-48c7-8535-76f3e1f5c303-config\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.623510 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.623605 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbcaa531-3e09-48c7-8535-76f3e1f5c303-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.623673 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snwvj\" (UniqueName: \"kubernetes.io/projected/dbcaa531-3e09-48c7-8535-76f3e1f5c303-kube-api-access-snwvj\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.623712 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.623737 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.623754 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbcaa531-3e09-48c7-8535-76f3e1f5c303-scripts\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.725101 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.725396 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbcaa531-3e09-48c7-8535-76f3e1f5c303-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.725500 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snwvj\" (UniqueName: \"kubernetes.io/projected/dbcaa531-3e09-48c7-8535-76f3e1f5c303-kube-api-access-snwvj\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.725620 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.725711 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") 
" pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.725781 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbcaa531-3e09-48c7-8535-76f3e1f5c303-scripts\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.725873 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbcaa531-3e09-48c7-8535-76f3e1f5c303-config\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.726018 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dbcaa531-3e09-48c7-8535-76f3e1f5c303-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.726729 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbcaa531-3e09-48c7-8535-76f3e1f5c303-config\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.726779 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbcaa531-3e09-48c7-8535-76f3e1f5c303-scripts\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.730895 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-metrics-certs-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.734082 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.737535 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbcaa531-3e09-48c7-8535-76f3e1f5c303-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.751537 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snwvj\" (UniqueName: \"kubernetes.io/projected/dbcaa531-3e09-48c7-8535-76f3e1f5c303-kube-api-access-snwvj\") pod \"ovn-northd-0\" (UID: \"dbcaa531-3e09-48c7-8535-76f3e1f5c303\") " pod="openstack/ovn-northd-0" Mar 21 04:42:52 crc kubenswrapper[4839]: I0321 04:42:52.869625 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 21 04:42:53 crc kubenswrapper[4839]: I0321 04:42:53.124641 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 21 04:42:53 crc kubenswrapper[4839]: I0321 04:42:53.379716 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dbcaa531-3e09-48c7-8535-76f3e1f5c303","Type":"ContainerStarted","Data":"cb26c02e4b67b2ee72c3a057d0d13e316aa7baf49fd4350ce7ca2df46971a652"} Mar 21 04:42:54 crc kubenswrapper[4839]: I0321 04:42:54.510718 4839 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod2f135620-f512-4bfe-9875-4d4e07c6a0f5"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod2f135620-f512-4bfe-9875-4d4e07c6a0f5] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2f135620_f512_4bfe_9875_4d4e07c6a0f5.slice" Mar 21 04:42:54 crc kubenswrapper[4839]: E0321 04:42:54.510805 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod2f135620-f512-4bfe-9875-4d4e07c6a0f5] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod2f135620-f512-4bfe-9875-4d4e07c6a0f5] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2f135620_f512_4bfe_9875_4d4e07c6a0f5.slice" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n" podUID="2f135620-f512-4bfe-9875-4d4e07c6a0f5" Mar 21 04:42:55 crc kubenswrapper[4839]: I0321 04:42:55.397184 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-7tj8n" Mar 21 04:42:55 crc kubenswrapper[4839]: I0321 04:42:55.397213 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dbcaa531-3e09-48c7-8535-76f3e1f5c303","Type":"ContainerStarted","Data":"c22c6e1a7be56bf8725b63f090a6cdc87ec4c4d7a1e8cf8cc014466210946c61"} Mar 21 04:42:55 crc kubenswrapper[4839]: I0321 04:42:55.397678 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dbcaa531-3e09-48c7-8535-76f3e1f5c303","Type":"ContainerStarted","Data":"3b217217438dd5c4ebb0a93f1eac576c17e0299263c356ee52a23ab7cb389bc3"} Mar 21 04:42:55 crc kubenswrapper[4839]: I0321 04:42:55.398008 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 21 04:42:55 crc kubenswrapper[4839]: I0321 04:42:55.425531 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.8447265449999999 podStartE2EDuration="3.425512838s" podCreationTimestamp="2026-03-21 04:42:52 +0000 UTC" firstStartedPulling="2026-03-21 04:42:53.133021994 +0000 UTC m=+1177.460808670" lastFinishedPulling="2026-03-21 04:42:54.713808287 +0000 UTC m=+1179.041594963" observedRunningTime="2026-03-21 04:42:55.423867732 +0000 UTC m=+1179.751654408" watchObservedRunningTime="2026-03-21 04:42:55.425512838 +0000 UTC m=+1179.753299524" Mar 21 04:42:55 crc kubenswrapper[4839]: I0321 04:42:55.485978 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7tj8n"] Mar 21 04:42:55 crc kubenswrapper[4839]: I0321 04:42:55.494169 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-7tj8n"] Mar 21 04:42:56 crc kubenswrapper[4839]: I0321 04:42:56.460949 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f135620-f512-4bfe-9875-4d4e07c6a0f5" 
path="/var/lib/kubelet/pods/2f135620-f512-4bfe-9875-4d4e07c6a0f5/volumes" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.287856 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-vxcbj"] Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.289303 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerName="dnsmasq-dns" containerID="cri-o://5f6671d09fc3d7cf05752820e4458b198d84c62a10462305fe2e9de3a8094910" gracePeriod=10 Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.294634 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.319025 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zkqb7"] Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.320279 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.335555 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.335666 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-config\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.335704 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksgp4\" (UniqueName: \"kubernetes.io/projected/67dd1633-1450-4153-b0af-b6887f61944c-kube-api-access-ksgp4\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.335742 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.335879 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: 
\"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.357758 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zkqb7"] Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.437050 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.437123 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.437187 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-config\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.437220 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksgp4\" (UniqueName: \"kubernetes.io/projected/67dd1633-1450-4153-b0af-b6887f61944c-kube-api-access-ksgp4\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.437260 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.438287 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.438852 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.439373 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.439887 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-config\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.458837 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksgp4\" (UniqueName: \"kubernetes.io/projected/67dd1633-1450-4153-b0af-b6887f61944c-kube-api-access-ksgp4\") pod \"dnsmasq-dns-b8fbc5445-zkqb7\" 
(UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:58 crc kubenswrapper[4839]: I0321 04:42:58.638991 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.106107 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zkqb7"] Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.125527 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f1edf0d-f220-4815-aeb6-e4507576247a" containerID="eb929a7642759119016130bc2484941db41a93bf8dd7c08b25451605be57abdb" exitCode=0 Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.125937 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4f1edf0d-f220-4815-aeb6-e4507576247a","Type":"ContainerDied","Data":"eb929a7642759119016130bc2484941db41a93bf8dd7c08b25451605be57abdb"} Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.133438 4839 generic.go:334] "Generic (PLEG): container finished" podID="30d22e92-45bd-4d1e-954e-3ade801245d4" containerID="03af0871636dda05a3e9679ab089635f4fe936a9737c3d6ca310aaab50787692" exitCode=0 Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.133536 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30d22e92-45bd-4d1e-954e-3ade801245d4","Type":"ContainerDied","Data":"03af0871636dda05a3e9679ab089635f4fe936a9737c3d6ca310aaab50787692"} Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.138495 4839 generic.go:334] "Generic (PLEG): container finished" podID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerID="5f6671d09fc3d7cf05752820e4458b198d84c62a10462305fe2e9de3a8094910" exitCode=0 Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.138539 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" 
event={"ID":"40fc2c3c-2c23-497a-89d6-906ba78506c2","Type":"ContainerDied","Data":"5f6671d09fc3d7cf05752820e4458b198d84c62a10462305fe2e9de3a8094910"} Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.176532 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.387549 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.392707 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.395812 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-jcqm2" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.396053 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.407853 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.407870 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.413610 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.483757 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.554897 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/9848d2f0-c562-4b2a-bd1c-cd91c6754079-cache\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.555212 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drfd2\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-kube-api-access-drfd2\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.555259 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.555309 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.555349 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9848d2f0-c562-4b2a-bd1c-cd91c6754079-lock\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.555365 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9848d2f0-c562-4b2a-bd1c-cd91c6754079-combined-ca-bundle\") pod \"swift-storage-0\" (UID: 
\"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.657466 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9848d2f0-c562-4b2a-bd1c-cd91c6754079-cache\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.657527 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drfd2\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-kube-api-access-drfd2\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.657555 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.657623 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.657653 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9848d2f0-c562-4b2a-bd1c-cd91c6754079-lock\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.657667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9848d2f0-c562-4b2a-bd1c-cd91c6754079-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: E0321 04:42:59.657886 4839 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 04:42:59 crc kubenswrapper[4839]: E0321 04:42:59.657920 4839 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 04:42:59 crc kubenswrapper[4839]: E0321 04:42:59.657982 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift podName:9848d2f0-c562-4b2a-bd1c-cd91c6754079 nodeName:}" failed. No retries permitted until 2026-03-21 04:43:00.157962832 +0000 UTC m=+1184.485749508 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift") pod "swift-storage-0" (UID: "9848d2f0-c562-4b2a-bd1c-cd91c6754079") : configmap "swift-ring-files" not found Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.658101 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.658471 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9848d2f0-c562-4b2a-bd1c-cd91c6754079-cache\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.659587 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9848d2f0-c562-4b2a-bd1c-cd91c6754079-lock\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.664053 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9848d2f0-c562-4b2a-bd1c-cd91c6754079-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.674436 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drfd2\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-kube-api-access-drfd2\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:42:59 crc kubenswrapper[4839]: I0321 04:42:59.683385 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:43:00 crc kubenswrapper[4839]: I0321 04:43:00.149790 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" event={"ID":"67dd1633-1450-4153-b0af-b6887f61944c","Type":"ContainerStarted","Data":"57de16c4224a656e8f3fcae76650a94702fb081fd5f9e8c3856fcde976889201"} Mar 21 04:43:00 crc kubenswrapper[4839]: I0321 04:43:00.165743 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 
21 04:43:00 crc kubenswrapper[4839]: E0321 04:43:00.166049 4839 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 04:43:00 crc kubenswrapper[4839]: E0321 04:43:00.166092 4839 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 04:43:00 crc kubenswrapper[4839]: E0321 04:43:00.166302 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift podName:9848d2f0-c562-4b2a-bd1c-cd91c6754079 nodeName:}" failed. No retries permitted until 2026-03-21 04:43:01.166279461 +0000 UTC m=+1185.494066137 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift") pod "swift-storage-0" (UID: "9848d2f0-c562-4b2a-bd1c-cd91c6754079") : configmap "swift-ring-files" not found Mar 21 04:43:01 crc kubenswrapper[4839]: I0321 04:43:01.200098 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:43:01 crc kubenswrapper[4839]: E0321 04:43:01.200315 4839 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 04:43:01 crc kubenswrapper[4839]: E0321 04:43:01.200396 4839 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 04:43:01 crc kubenswrapper[4839]: E0321 04:43:01.200449 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift podName:9848d2f0-c562-4b2a-bd1c-cd91c6754079 
nodeName:}" failed. No retries permitted until 2026-03-21 04:43:03.200434762 +0000 UTC m=+1187.528221438 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift") pod "swift-storage-0" (UID: "9848d2f0-c562-4b2a-bd1c-cd91c6754079") : configmap "swift-ring-files" not found Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.166194 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30d22e92-45bd-4d1e-954e-3ade801245d4","Type":"ContainerStarted","Data":"d0d0cc3b9992c79ba662d880c6be2ed4e505bdd6e757aa9952909ab5d979d31f"} Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.168346 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" event={"ID":"67dd1633-1450-4153-b0af-b6887f61944c","Type":"ContainerStarted","Data":"285d767665dbf1b22bee7f8005f18b61072968dd727d608ba30f4f564d8882bb"} Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.170512 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4f1edf0d-f220-4815-aeb6-e4507576247a","Type":"ContainerStarted","Data":"067f116e13aa69b91b4d2ca31991a45e297831881328d0b0917f54a9ef074313"} Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.401388 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.523994 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dqqx\" (UniqueName: \"kubernetes.io/projected/40fc2c3c-2c23-497a-89d6-906ba78506c2-kube-api-access-9dqqx\") pod \"40fc2c3c-2c23-497a-89d6-906ba78506c2\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.524588 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-dns-svc\") pod \"40fc2c3c-2c23-497a-89d6-906ba78506c2\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.524651 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-config\") pod \"40fc2c3c-2c23-497a-89d6-906ba78506c2\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.524683 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-ovsdbserver-sb\") pod \"40fc2c3c-2c23-497a-89d6-906ba78506c2\" (UID: \"40fc2c3c-2c23-497a-89d6-906ba78506c2\") " Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.530025 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40fc2c3c-2c23-497a-89d6-906ba78506c2-kube-api-access-9dqqx" (OuterVolumeSpecName: "kube-api-access-9dqqx") pod "40fc2c3c-2c23-497a-89d6-906ba78506c2" (UID: "40fc2c3c-2c23-497a-89d6-906ba78506c2"). InnerVolumeSpecName "kube-api-access-9dqqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.565250 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "40fc2c3c-2c23-497a-89d6-906ba78506c2" (UID: "40fc2c3c-2c23-497a-89d6-906ba78506c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.569513 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-config" (OuterVolumeSpecName: "config") pod "40fc2c3c-2c23-497a-89d6-906ba78506c2" (UID: "40fc2c3c-2c23-497a-89d6-906ba78506c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.574376 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "40fc2c3c-2c23-497a-89d6-906ba78506c2" (UID: "40fc2c3c-2c23-497a-89d6-906ba78506c2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.627998 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.628073 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.628094 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dqqx\" (UniqueName: \"kubernetes.io/projected/40fc2c3c-2c23-497a-89d6-906ba78506c2-kube-api-access-9dqqx\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:02 crc kubenswrapper[4839]: I0321 04:43:02.628108 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40fc2c3c-2c23-497a-89d6-906ba78506c2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.178205 4839 generic.go:334] "Generic (PLEG): container finished" podID="67dd1633-1450-4153-b0af-b6887f61944c" containerID="285d767665dbf1b22bee7f8005f18b61072968dd727d608ba30f4f564d8882bb" exitCode=0 Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.178268 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" event={"ID":"67dd1633-1450-4153-b0af-b6887f61944c","Type":"ContainerDied","Data":"285d767665dbf1b22bee7f8005f18b61072968dd727d608ba30f4f564d8882bb"} Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.180437 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.180469 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-vxcbj" event={"ID":"40fc2c3c-2c23-497a-89d6-906ba78506c2","Type":"ContainerDied","Data":"19aecc8d605d42905ae54d5640d84373129c9b80e1392d6792f10bef52bd1252"} Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.180505 4839 scope.go:117] "RemoveContainer" containerID="5f6671d09fc3d7cf05752820e4458b198d84c62a10462305fe2e9de3a8094910" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.183121 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"76b8f1b8-aa66-4f5e-937a-f837a2da28f1","Type":"ContainerStarted","Data":"ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b"} Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.183638 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.207717 4839 scope.go:117] "RemoveContainer" containerID="8a45ce0a9f6faa9e05d38583bc564141b514dd0db35ef0ce33729091517ea300" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.238312 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=38.045040249 podStartE2EDuration="51.238292313s" podCreationTimestamp="2026-03-21 04:42:12 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.702505945 +0000 UTC m=+1147.030292621" lastFinishedPulling="2026-03-21 04:42:35.895758009 +0000 UTC m=+1160.223544685" observedRunningTime="2026-03-21 04:43:03.223979852 +0000 UTC m=+1187.551766538" watchObservedRunningTime="2026-03-21 04:43:03.238292313 +0000 UTC m=+1187.566078989" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.241538 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:43:03 crc kubenswrapper[4839]: E0321 04:43:03.241868 4839 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 04:43:03 crc kubenswrapper[4839]: E0321 04:43:03.241888 4839 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 04:43:03 crc kubenswrapper[4839]: E0321 04:43:03.241932 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift podName:9848d2f0-c562-4b2a-bd1c-cd91c6754079 nodeName:}" failed. No retries permitted until 2026-03-21 04:43:07.241916224 +0000 UTC m=+1191.569702900 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift") pod "swift-storage-0" (UID: "9848d2f0-c562-4b2a-bd1c-cd91c6754079") : configmap "swift-ring-files" not found Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.252626 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=6.398790814 podStartE2EDuration="46.252605513s" podCreationTimestamp="2026-03-21 04:42:17 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.678151984 +0000 UTC m=+1147.005938660" lastFinishedPulling="2026-03-21 04:43:02.531966683 +0000 UTC m=+1186.859753359" observedRunningTime="2026-03-21 04:43:03.240946577 +0000 UTC m=+1187.568733253" watchObservedRunningTime="2026-03-21 04:43:03.252605513 +0000 UTC m=+1187.580392189" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.274014 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstack-cell1-galera-0" podStartSLOduration=35.118178033 podStartE2EDuration="50.273998121s" podCreationTimestamp="2026-03-21 04:42:13 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.899368502 +0000 UTC m=+1147.227155178" lastFinishedPulling="2026-03-21 04:42:38.05518859 +0000 UTC m=+1162.382975266" observedRunningTime="2026-03-21 04:43:03.271705927 +0000 UTC m=+1187.599492603" watchObservedRunningTime="2026-03-21 04:43:03.273998121 +0000 UTC m=+1187.601784797" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.292908 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-vxcbj"] Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.298865 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-vxcbj"] Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.375612 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-grqqv"] Mar 21 04:43:03 crc kubenswrapper[4839]: E0321 04:43:03.376147 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerName="dnsmasq-dns" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.376181 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerName="dnsmasq-dns" Mar 21 04:43:03 crc kubenswrapper[4839]: E0321 04:43:03.376220 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerName="init" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.376227 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerName="init" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.376411 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" containerName="dnsmasq-dns" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.379931 
4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.385198 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.385242 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.385786 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.411415 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-grqqv"] Mar 21 04:43:03 crc kubenswrapper[4839]: E0321 04:43:03.412354 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-698g5 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-698g5 ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-grqqv" podUID="40c8620e-6bc9-444b-8ab8-9428633490f3" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.426061 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-kkvzq"] Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.427326 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.446500 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kkvzq"] Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.466792 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.467120 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.469408 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-grqqv"] Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547416 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-dispersionconf\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547520 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-scripts\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547544 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-dispersionconf\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547577 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-swiftconf\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547601 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-combined-ca-bundle\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547623 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40c8620e-6bc9-444b-8ab8-9428633490f3-etc-swift\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547673 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-698g5\" (UniqueName: \"kubernetes.io/projected/40c8620e-6bc9-444b-8ab8-9428633490f3-kube-api-access-698g5\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547711 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-scripts\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547743 
4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-combined-ca-bundle\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547781 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5484abbf-53f2-445a-b6fe-0996eba95345-etc-swift\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547802 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-swiftconf\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547846 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-ring-data-devices\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.547872 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-ring-data-devices\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 
04:43:03.547911 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4552p\" (UniqueName: \"kubernetes.io/projected/5484abbf-53f2-445a-b6fe-0996eba95345-kube-api-access-4552p\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.649182 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-ring-data-devices\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.649548 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-ring-data-devices\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.649679 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4552p\" (UniqueName: \"kubernetes.io/projected/5484abbf-53f2-445a-b6fe-0996eba95345-kube-api-access-4552p\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.649780 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-dispersionconf\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.649918 
4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-scripts\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650024 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-dispersionconf\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650099 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-swiftconf\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650174 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-combined-ca-bundle\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650247 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40c8620e-6bc9-444b-8ab8-9428633490f3-etc-swift\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650323 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-698g5\" 
(UniqueName: \"kubernetes.io/projected/40c8620e-6bc9-444b-8ab8-9428633490f3-kube-api-access-698g5\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650403 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-scripts\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650484 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-combined-ca-bundle\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650581 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-ring-data-devices\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650589 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5484abbf-53f2-445a-b6fe-0996eba95345-etc-swift\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.650691 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-swiftconf\") pod 
\"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.651119 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5484abbf-53f2-445a-b6fe-0996eba95345-etc-swift\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.651241 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-ring-data-devices\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.651548 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40c8620e-6bc9-444b-8ab8-9428633490f3-etc-swift\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.651848 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-scripts\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.652080 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-scripts\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc 
kubenswrapper[4839]: I0321 04:43:03.654972 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-dispersionconf\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.655713 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-swiftconf\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.656329 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-combined-ca-bundle\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.658004 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-combined-ca-bundle\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.659073 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-swiftconf\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.663146 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-dispersionconf\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.669278 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4552p\" (UniqueName: \"kubernetes.io/projected/5484abbf-53f2-445a-b6fe-0996eba95345-kube-api-access-4552p\") pod \"swift-ring-rebalance-kkvzq\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.671604 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-698g5\" (UniqueName: \"kubernetes.io/projected/40c8620e-6bc9-444b-8ab8-9428633490f3-kube-api-access-698g5\") pod \"swift-ring-rebalance-grqqv\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:03 crc kubenswrapper[4839]: I0321 04:43:03.760121 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.200003 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" event={"ID":"67dd1633-1450-4153-b0af-b6887f61944c","Type":"ContainerStarted","Data":"66a460b182805c08827a7b4f6980d98fea84c8290c7b4fe1cb071b3630a6c029"} Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.200066 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.226547 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" podStartSLOduration=6.226529698 podStartE2EDuration="6.226529698s" podCreationTimestamp="2026-03-21 04:42:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:04.224171272 +0000 UTC m=+1188.551957968" watchObservedRunningTime="2026-03-21 04:43:04.226529698 +0000 UTC m=+1188.554316374" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.229589 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:04 crc kubenswrapper[4839]: W0321 04:43:04.264125 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5484abbf_53f2_445a_b6fe_0996eba95345.slice/crio-609cfec14cc280f7aa99193851e53d3aac7200e8836be9f73b80cc6653f3fdc5 WatchSource:0}: Error finding container 609cfec14cc280f7aa99193851e53d3aac7200e8836be9f73b80cc6653f3fdc5: Status 404 returned error can't find the container with id 609cfec14cc280f7aa99193851e53d3aac7200e8836be9f73b80cc6653f3fdc5 Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.268927 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-kkvzq"] Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.363738 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-dispersionconf\") pod \"40c8620e-6bc9-444b-8ab8-9428633490f3\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.364014 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-698g5\" (UniqueName: \"kubernetes.io/projected/40c8620e-6bc9-444b-8ab8-9428633490f3-kube-api-access-698g5\") pod \"40c8620e-6bc9-444b-8ab8-9428633490f3\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.364043 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40c8620e-6bc9-444b-8ab8-9428633490f3-etc-swift\") pod \"40c8620e-6bc9-444b-8ab8-9428633490f3\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.364177 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-swiftconf\") pod \"40c8620e-6bc9-444b-8ab8-9428633490f3\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.364230 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-scripts\") pod \"40c8620e-6bc9-444b-8ab8-9428633490f3\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.364260 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-combined-ca-bundle\") pod \"40c8620e-6bc9-444b-8ab8-9428633490f3\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.364307 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-ring-data-devices\") pod \"40c8620e-6bc9-444b-8ab8-9428633490f3\" (UID: \"40c8620e-6bc9-444b-8ab8-9428633490f3\") " Mar 21 
04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.365678 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40c8620e-6bc9-444b-8ab8-9428633490f3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "40c8620e-6bc9-444b-8ab8-9428633490f3" (UID: "40c8620e-6bc9-444b-8ab8-9428633490f3"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.367128 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-scripts" (OuterVolumeSpecName: "scripts") pod "40c8620e-6bc9-444b-8ab8-9428633490f3" (UID: "40c8620e-6bc9-444b-8ab8-9428633490f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.367524 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "40c8620e-6bc9-444b-8ab8-9428633490f3" (UID: "40c8620e-6bc9-444b-8ab8-9428633490f3"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.371660 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "40c8620e-6bc9-444b-8ab8-9428633490f3" (UID: "40c8620e-6bc9-444b-8ab8-9428633490f3"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.371716 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c8620e-6bc9-444b-8ab8-9428633490f3-kube-api-access-698g5" (OuterVolumeSpecName: "kube-api-access-698g5") pod "40c8620e-6bc9-444b-8ab8-9428633490f3" (UID: "40c8620e-6bc9-444b-8ab8-9428633490f3"). InnerVolumeSpecName "kube-api-access-698g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.375853 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "40c8620e-6bc9-444b-8ab8-9428633490f3" (UID: "40c8620e-6bc9-444b-8ab8-9428633490f3"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.376005 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40c8620e-6bc9-444b-8ab8-9428633490f3" (UID: "40c8620e-6bc9-444b-8ab8-9428633490f3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.462065 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40fc2c3c-2c23-497a-89d6-906ba78506c2" path="/var/lib/kubelet/pods/40fc2c3c-2c23-497a-89d6-906ba78506c2/volumes" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.466896 4839 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.466936 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.466948 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.466964 4839 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/40c8620e-6bc9-444b-8ab8-9428633490f3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.466976 4839 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/40c8620e-6bc9-444b-8ab8-9428633490f3-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.466989 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-698g5\" (UniqueName: \"kubernetes.io/projected/40c8620e-6bc9-444b-8ab8-9428633490f3-kube-api-access-698g5\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.467000 4839 reconciler_common.go:293] 
"Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/40c8620e-6bc9-444b-8ab8-9428633490f3-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.740930 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 21 04:43:04 crc kubenswrapper[4839]: I0321 04:43:04.740984 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 21 04:43:05 crc kubenswrapper[4839]: I0321 04:43:05.215029 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kkvzq" event={"ID":"5484abbf-53f2-445a-b6fe-0996eba95345","Type":"ContainerStarted","Data":"609cfec14cc280f7aa99193851e53d3aac7200e8836be9f73b80cc6653f3fdc5"} Mar 21 04:43:05 crc kubenswrapper[4839]: I0321 04:43:05.215059 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-grqqv" Mar 21 04:43:05 crc kubenswrapper[4839]: I0321 04:43:05.215428 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:43:05 crc kubenswrapper[4839]: I0321 04:43:05.251811 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-grqqv"] Mar 21 04:43:05 crc kubenswrapper[4839]: I0321 04:43:05.260623 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-grqqv"] Mar 21 04:43:06 crc kubenswrapper[4839]: I0321 04:43:06.464557 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c8620e-6bc9-444b-8ab8-9428633490f3" path="/var/lib/kubelet/pods/40c8620e-6bc9-444b-8ab8-9428633490f3/volumes" Mar 21 04:43:07 crc kubenswrapper[4839]: I0321 04:43:07.312010 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:43:07 crc kubenswrapper[4839]: E0321 04:43:07.312266 4839 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 04:43:07 crc kubenswrapper[4839]: E0321 04:43:07.312310 4839 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 04:43:07 crc kubenswrapper[4839]: E0321 04:43:07.312377 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift podName:9848d2f0-c562-4b2a-bd1c-cd91c6754079 nodeName:}" failed. No retries permitted until 2026-03-21 04:43:15.312355975 +0000 UTC m=+1199.640142651 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift") pod "swift-storage-0" (UID: "9848d2f0-c562-4b2a-bd1c-cd91c6754079") : configmap "swift-ring-files" not found Mar 21 04:43:07 crc kubenswrapper[4839]: I0321 04:43:07.478939 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 21 04:43:08 crc kubenswrapper[4839]: I0321 04:43:08.640828 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:43:08 crc kubenswrapper[4839]: I0321 04:43:08.696820 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4z7nl"] Mar 21 04:43:08 crc kubenswrapper[4839]: I0321 04:43:08.697071 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-4z7nl" podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerName="dnsmasq-dns" 
containerID="cri-o://97084a1051c26c4cfcbee9a2a345f9fe3d46b532fdf62d0b9b04772413ec0e3b" gracePeriod=10 Mar 21 04:43:09 crc kubenswrapper[4839]: I0321 04:43:09.483498 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-4z7nl" podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Mar 21 04:43:12 crc kubenswrapper[4839]: I0321 04:43:12.264483 4839 generic.go:334] "Generic (PLEG): container finished" podID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerID="97084a1051c26c4cfcbee9a2a345f9fe3d46b532fdf62d0b9b04772413ec0e3b" exitCode=0 Mar 21 04:43:12 crc kubenswrapper[4839]: I0321 04:43:12.264558 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4z7nl" event={"ID":"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a","Type":"ContainerDied","Data":"97084a1051c26c4cfcbee9a2a345f9fe3d46b532fdf62d0b9b04772413ec0e3b"} Mar 21 04:43:12 crc kubenswrapper[4839]: I0321 04:43:12.932489 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 21 04:43:13 crc kubenswrapper[4839]: I0321 04:43:13.579386 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 21 04:43:13 crc kubenswrapper[4839]: I0321 04:43:13.657196 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.217902 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.283965 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4z7nl" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.284383 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4z7nl" event={"ID":"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a","Type":"ContainerDied","Data":"8e39c8706b3815a73340c1ae1ac875a1799d9491908ef6b336359be187e60c9d"} Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.285218 4839 scope.go:117] "RemoveContainer" containerID="97084a1051c26c4cfcbee9a2a345f9fe3d46b532fdf62d0b9b04772413ec0e3b" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.331721 4839 scope.go:117] "RemoveContainer" containerID="fdec34f8addba6741ac12f007463e3379c5cafeea2de83548fa2bc44a873ebd5" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.343463 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-nb\") pod \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.343531 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j56r2\" (UniqueName: \"kubernetes.io/projected/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-kube-api-access-j56r2\") pod \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.343560 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-sb\") pod \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.343774 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-config\") pod \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.343821 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-dns-svc\") pod \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\" (UID: \"d92c95bc-4fde-4d24-ad6e-d4583ec19b3a\") " Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.347486 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-kube-api-access-j56r2" (OuterVolumeSpecName: "kube-api-access-j56r2") pod "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" (UID: "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a"). InnerVolumeSpecName "kube-api-access-j56r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.387049 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" (UID: "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.387784 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" (UID: "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.393741 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" (UID: "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.394824 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-config" (OuterVolumeSpecName: "config") pod "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" (UID: "d92c95bc-4fde-4d24-ad6e-d4583ec19b3a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.445864 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.445908 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.445917 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.445929 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j56r2\" (UniqueName: \"kubernetes.io/projected/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-kube-api-access-j56r2\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:14 crc 
kubenswrapper[4839]: I0321 04:43:14.445937 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.605695 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4z7nl"] Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.631907 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4z7nl"] Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.878066 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 21 04:43:14 crc kubenswrapper[4839]: I0321 04:43:14.959003 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 21 04:43:15 crc kubenswrapper[4839]: I0321 04:43:15.290305 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kkvzq" event={"ID":"5484abbf-53f2-445a-b6fe-0996eba95345","Type":"ContainerStarted","Data":"93b552b909df831d40a3b2b56c1f6ba5babeea45e19a365649677dff6a5a3b56"} Mar 21 04:43:15 crc kubenswrapper[4839]: I0321 04:43:15.309911 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-kkvzq" podStartSLOduration=2.354275297 podStartE2EDuration="12.309885448s" podCreationTimestamp="2026-03-21 04:43:03 +0000 UTC" firstStartedPulling="2026-03-21 04:43:04.267727821 +0000 UTC m=+1188.595514497" lastFinishedPulling="2026-03-21 04:43:14.223337972 +0000 UTC m=+1198.551124648" observedRunningTime="2026-03-21 04:43:15.307552892 +0000 UTC m=+1199.635339578" watchObservedRunningTime="2026-03-21 04:43:15.309885448 +0000 UTC m=+1199.637672124" Mar 21 04:43:15 crc kubenswrapper[4839]: I0321 04:43:15.359600 4839 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:43:15 crc kubenswrapper[4839]: E0321 04:43:15.359800 4839 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 21 04:43:15 crc kubenswrapper[4839]: E0321 04:43:15.359816 4839 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 21 04:43:15 crc kubenswrapper[4839]: E0321 04:43:15.360269 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift podName:9848d2f0-c562-4b2a-bd1c-cd91c6754079 nodeName:}" failed. No retries permitted until 2026-03-21 04:43:31.360214026 +0000 UTC m=+1215.688000702 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift") pod "swift-storage-0" (UID: "9848d2f0-c562-4b2a-bd1c-cd91c6754079") : configmap "swift-ring-files" not found Mar 21 04:43:15 crc kubenswrapper[4839]: I0321 04:43:15.791982 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qt5s4" podUID="b31b64cb-0266-4b8a-9fcb-ae5e36c8309a" containerName="ovn-controller" probeResult="failure" output=< Mar 21 04:43:15 crc kubenswrapper[4839]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 21 04:43:15 crc kubenswrapper[4839]: > Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.110311 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-852d-account-create-update-nv5n7"] Mar 21 04:43:16 crc kubenswrapper[4839]: E0321 04:43:16.110646 4839 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerName="dnsmasq-dns" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.110679 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerName="dnsmasq-dns" Mar 21 04:43:16 crc kubenswrapper[4839]: E0321 04:43:16.110698 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerName="init" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.110705 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerName="init" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.110870 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" containerName="dnsmasq-dns" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.111347 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.113635 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.119996 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-852d-account-create-update-nv5n7"] Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.162914 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-v4k9c"] Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.163859 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.173609 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnt44\" (UniqueName: \"kubernetes.io/projected/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-kube-api-access-wnt44\") pod \"keystone-852d-account-create-update-nv5n7\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.173717 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-operator-scripts\") pod \"keystone-852d-account-create-update-nv5n7\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.176547 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-v4k9c"] Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.274788 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49pjl\" (UniqueName: \"kubernetes.io/projected/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-kube-api-access-49pjl\") pod \"keystone-db-create-v4k9c\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.275009 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnt44\" (UniqueName: \"kubernetes.io/projected/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-kube-api-access-wnt44\") pod \"keystone-852d-account-create-update-nv5n7\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: 
I0321 04:43:16.275056 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-operator-scripts\") pod \"keystone-db-create-v4k9c\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.275141 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-operator-scripts\") pod \"keystone-852d-account-create-update-nv5n7\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.275919 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-operator-scripts\") pod \"keystone-852d-account-create-update-nv5n7\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.293548 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnt44\" (UniqueName: \"kubernetes.io/projected/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-kube-api-access-wnt44\") pod \"keystone-852d-account-create-update-nv5n7\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.322480 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-tnx95"] Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.324499 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.333092 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-tnx95"] Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.385975 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-operator-scripts\") pod \"keystone-db-create-v4k9c\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.386049 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a6840a-2ece-4b8d-be60-caa89912db9f-operator-scripts\") pod \"placement-db-create-tnx95\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.386166 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49pjl\" (UniqueName: \"kubernetes.io/projected/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-kube-api-access-49pjl\") pod \"keystone-db-create-v4k9c\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.386199 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm69x\" (UniqueName: \"kubernetes.io/projected/c9a6840a-2ece-4b8d-be60-caa89912db9f-kube-api-access-fm69x\") pod \"placement-db-create-tnx95\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.386928 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-operator-scripts\") pod \"keystone-db-create-v4k9c\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.412674 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49pjl\" (UniqueName: \"kubernetes.io/projected/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-kube-api-access-49pjl\") pod \"keystone-db-create-v4k9c\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.430911 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.437524 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1eec-account-create-update-h7hp7"] Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.438667 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.441512 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.472983 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d92c95bc-4fde-4d24-ad6e-d4583ec19b3a" path="/var/lib/kubelet/pods/d92c95bc-4fde-4d24-ad6e-d4583ec19b3a/volumes" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.474893 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1eec-account-create-update-h7hp7"] Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.478375 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.487965 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frcpm\" (UniqueName: \"kubernetes.io/projected/59ce13a7-d2a6-4c54-908d-39d1511da50b-kube-api-access-frcpm\") pod \"placement-1eec-account-create-update-h7hp7\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.488061 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a6840a-2ece-4b8d-be60-caa89912db9f-operator-scripts\") pod \"placement-db-create-tnx95\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.488127 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ce13a7-d2a6-4c54-908d-39d1511da50b-operator-scripts\") pod \"placement-1eec-account-create-update-h7hp7\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.488159 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm69x\" (UniqueName: \"kubernetes.io/projected/c9a6840a-2ece-4b8d-be60-caa89912db9f-kube-api-access-fm69x\") pod \"placement-db-create-tnx95\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.489075 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a6840a-2ece-4b8d-be60-caa89912db9f-operator-scripts\") pod 
\"placement-db-create-tnx95\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.513404 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm69x\" (UniqueName: \"kubernetes.io/projected/c9a6840a-2ece-4b8d-be60-caa89912db9f-kube-api-access-fm69x\") pod \"placement-db-create-tnx95\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.591656 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ce13a7-d2a6-4c54-908d-39d1511da50b-operator-scripts\") pod \"placement-1eec-account-create-update-h7hp7\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.592022 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frcpm\" (UniqueName: \"kubernetes.io/projected/59ce13a7-d2a6-4c54-908d-39d1511da50b-kube-api-access-frcpm\") pod \"placement-1eec-account-create-update-h7hp7\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.593391 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ce13a7-d2a6-4c54-908d-39d1511da50b-operator-scripts\") pod \"placement-1eec-account-create-update-h7hp7\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.619431 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frcpm\" (UniqueName: 
\"kubernetes.io/projected/59ce13a7-d2a6-4c54-908d-39d1511da50b-kube-api-access-frcpm\") pod \"placement-1eec-account-create-update-h7hp7\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.690950 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tnx95" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.872376 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.950708 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-852d-account-create-update-nv5n7"] Mar 21 04:43:16 crc kubenswrapper[4839]: W0321 04:43:16.957785 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ad3cc08_174a_4164_aa38_3d7f6fbed0c0.slice/crio-fd06d64cd3e2f8af0c1e033552a48d6556973db424a6dde6732cb4af920ae9b3 WatchSource:0}: Error finding container fd06d64cd3e2f8af0c1e033552a48d6556973db424a6dde6732cb4af920ae9b3: Status 404 returned error can't find the container with id fd06d64cd3e2f8af0c1e033552a48d6556973db424a6dde6732cb4af920ae9b3 Mar 21 04:43:16 crc kubenswrapper[4839]: I0321 04:43:16.965712 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.028487 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-v4k9c"] Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.134597 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-tnx95"] Mar 21 04:43:17 crc kubenswrapper[4839]: W0321 04:43:17.157210 4839 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9a6840a_2ece_4b8d_be60_caa89912db9f.slice/crio-978b69facff26547540b897eeb4f977dfaff5f241499331330d6b7477a899fcb WatchSource:0}: Error finding container 978b69facff26547540b897eeb4f977dfaff5f241499331330d6b7477a899fcb: Status 404 returned error can't find the container with id 978b69facff26547540b897eeb4f977dfaff5f241499331330d6b7477a899fcb Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.308995 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-v4k9c" event={"ID":"e779c2ff-ee70-4779-b3fc-3b3bf87aff47","Type":"ContainerStarted","Data":"048e9e28ac07a1e9124d69a89e17059f1d443023c6faf3348223cf9a7387e352"} Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.309348 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-v4k9c" event={"ID":"e779c2ff-ee70-4779-b3fc-3b3bf87aff47","Type":"ContainerStarted","Data":"97bb4acf95a9a42df523280643f99bc7040e0d83179344bcc8f457ae232c2302"} Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.319857 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tnx95" event={"ID":"c9a6840a-2ece-4b8d-be60-caa89912db9f","Type":"ContainerStarted","Data":"978b69facff26547540b897eeb4f977dfaff5f241499331330d6b7477a899fcb"} Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.321580 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-852d-account-create-update-nv5n7" event={"ID":"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0","Type":"ContainerStarted","Data":"435dd7b699a596fb94e68ae9d7689a76011012e1ee2be4e567ac5a478d536eb6"} Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.321612 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-852d-account-create-update-nv5n7" 
event={"ID":"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0","Type":"ContainerStarted","Data":"fd06d64cd3e2f8af0c1e033552a48d6556973db424a6dde6732cb4af920ae9b3"} Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.327223 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-v4k9c" podStartSLOduration=1.327207793 podStartE2EDuration="1.327207793s" podCreationTimestamp="2026-03-21 04:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:17.325885256 +0000 UTC m=+1201.653671932" watchObservedRunningTime="2026-03-21 04:43:17.327207793 +0000 UTC m=+1201.654994469" Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.345226 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1eec-account-create-update-h7hp7"] Mar 21 04:43:17 crc kubenswrapper[4839]: I0321 04:43:17.346246 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-852d-account-create-update-nv5n7" podStartSLOduration=1.346229485 podStartE2EDuration="1.346229485s" podCreationTimestamp="2026-03-21 04:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:17.341272247 +0000 UTC m=+1201.669058933" watchObservedRunningTime="2026-03-21 04:43:17.346229485 +0000 UTC m=+1201.674016161" Mar 21 04:43:17 crc kubenswrapper[4839]: W0321 04:43:17.349933 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59ce13a7_d2a6_4c54_908d_39d1511da50b.slice/crio-b2ca9aec2596c246240aa5a6e603cc06199b5ecd40d56bba26b68d9a5bf760e3 WatchSource:0}: Error finding container b2ca9aec2596c246240aa5a6e603cc06199b5ecd40d56bba26b68d9a5bf760e3: Status 404 returned error can't find the container with id 
b2ca9aec2596c246240aa5a6e603cc06199b5ecd40d56bba26b68d9a5bf760e3 Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.330627 4839 generic.go:334] "Generic (PLEG): container finished" podID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" containerID="e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182" exitCode=0 Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.330753 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e1d0e8c-00aa-4770-9e58-b8f706d80a35","Type":"ContainerDied","Data":"e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182"} Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.337637 4839 generic.go:334] "Generic (PLEG): container finished" podID="8028561c-b039-4400-a065-b5efee753b5f" containerID="fcd7e300ab111a88b888a2fc68f007c49d0404de0648aa1177c5d04bb341e74c" exitCode=0 Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.337717 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8028561c-b039-4400-a065-b5efee753b5f","Type":"ContainerDied","Data":"fcd7e300ab111a88b888a2fc68f007c49d0404de0648aa1177c5d04bb341e74c"} Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.340457 4839 generic.go:334] "Generic (PLEG): container finished" podID="8ad3cc08-174a-4164-aa38-3d7f6fbed0c0" containerID="435dd7b699a596fb94e68ae9d7689a76011012e1ee2be4e567ac5a478d536eb6" exitCode=0 Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.340622 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-852d-account-create-update-nv5n7" event={"ID":"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0","Type":"ContainerDied","Data":"435dd7b699a596fb94e68ae9d7689a76011012e1ee2be4e567ac5a478d536eb6"} Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.344533 4839 generic.go:334] "Generic (PLEG): container finished" podID="59ce13a7-d2a6-4c54-908d-39d1511da50b" 
containerID="e304597468db9fac443bca06d530c54513659f708975616b601e678f7766dbe4" exitCode=0 Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.344618 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1eec-account-create-update-h7hp7" event={"ID":"59ce13a7-d2a6-4c54-908d-39d1511da50b","Type":"ContainerDied","Data":"e304597468db9fac443bca06d530c54513659f708975616b601e678f7766dbe4"} Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.344649 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1eec-account-create-update-h7hp7" event={"ID":"59ce13a7-d2a6-4c54-908d-39d1511da50b","Type":"ContainerStarted","Data":"b2ca9aec2596c246240aa5a6e603cc06199b5ecd40d56bba26b68d9a5bf760e3"} Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.352459 4839 generic.go:334] "Generic (PLEG): container finished" podID="e779c2ff-ee70-4779-b3fc-3b3bf87aff47" containerID="048e9e28ac07a1e9124d69a89e17059f1d443023c6faf3348223cf9a7387e352" exitCode=0 Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.352524 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-v4k9c" event={"ID":"e779c2ff-ee70-4779-b3fc-3b3bf87aff47","Type":"ContainerDied","Data":"048e9e28ac07a1e9124d69a89e17059f1d443023c6faf3348223cf9a7387e352"} Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.356520 4839 generic.go:334] "Generic (PLEG): container finished" podID="c9a6840a-2ece-4b8d-be60-caa89912db9f" containerID="e6c4b76ad2e0d2ae69413d1d9b61feffab9768c0ce8180f11f0b591ba10e6f2c" exitCode=0 Mar 21 04:43:18 crc kubenswrapper[4839]: I0321 04:43:18.356598 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tnx95" event={"ID":"c9a6840a-2ece-4b8d-be60-caa89912db9f","Type":"ContainerDied","Data":"e6c4b76ad2e0d2ae69413d1d9b61feffab9768c0ce8180f11f0b591ba10e6f2c"} Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.388116 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e1d0e8c-00aa-4770-9e58-b8f706d80a35","Type":"ContainerStarted","Data":"7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318"} Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.389097 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.394855 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8028561c-b039-4400-a065-b5efee753b5f","Type":"ContainerStarted","Data":"804d2b77429b6dcf4164535d9f43dee6f0cff10defca7a0d78be2b02039b8f92"} Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.395125 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.442834 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=55.608994074 podStartE2EDuration="1m9.442651093s" podCreationTimestamp="2026-03-21 04:42:10 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.722289259 +0000 UTC m=+1147.050075935" lastFinishedPulling="2026-03-21 04:42:36.555946278 +0000 UTC m=+1160.883732954" observedRunningTime="2026-03-21 04:43:19.416696987 +0000 UTC m=+1203.744483683" watchObservedRunningTime="2026-03-21 04:43:19.442651093 +0000 UTC m=+1203.770437789" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.470977 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=56.658188445 podStartE2EDuration="1m9.470955065s" podCreationTimestamp="2026-03-21 04:42:10 +0000 UTC" firstStartedPulling="2026-03-21 04:42:22.742386071 +0000 UTC m=+1147.070172747" lastFinishedPulling="2026-03-21 04:42:35.555152691 +0000 UTC m=+1159.882939367" observedRunningTime="2026-03-21 04:43:19.459308949 +0000 UTC m=+1203.787095655" 
watchObservedRunningTime="2026-03-21 04:43:19.470955065 +0000 UTC m=+1203.798741741" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.844840 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.855546 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ce13a7-d2a6-4c54-908d-39d1511da50b-operator-scripts\") pod \"59ce13a7-d2a6-4c54-908d-39d1511da50b\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.855701 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frcpm\" (UniqueName: \"kubernetes.io/projected/59ce13a7-d2a6-4c54-908d-39d1511da50b-kube-api-access-frcpm\") pod \"59ce13a7-d2a6-4c54-908d-39d1511da50b\" (UID: \"59ce13a7-d2a6-4c54-908d-39d1511da50b\") " Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.856923 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59ce13a7-d2a6-4c54-908d-39d1511da50b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59ce13a7-d2a6-4c54-908d-39d1511da50b" (UID: "59ce13a7-d2a6-4c54-908d-39d1511da50b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.872157 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ce13a7-d2a6-4c54-908d-39d1511da50b-kube-api-access-frcpm" (OuterVolumeSpecName: "kube-api-access-frcpm") pod "59ce13a7-d2a6-4c54-908d-39d1511da50b" (UID: "59ce13a7-d2a6-4c54-908d-39d1511da50b"). InnerVolumeSpecName "kube-api-access-frcpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.959687 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ce13a7-d2a6-4c54-908d-39d1511da50b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:19 crc kubenswrapper[4839]: I0321 04:43:19.959727 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frcpm\" (UniqueName: \"kubernetes.io/projected/59ce13a7-d2a6-4c54-908d-39d1511da50b-kube-api-access-frcpm\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.032419 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.041193 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tnx95" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.049830 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.060463 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-operator-scripts\") pod \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.060562 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49pjl\" (UniqueName: \"kubernetes.io/projected/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-kube-api-access-49pjl\") pod \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\" (UID: \"e779c2ff-ee70-4779-b3fc-3b3bf87aff47\") " Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.060612 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm69x\" (UniqueName: \"kubernetes.io/projected/c9a6840a-2ece-4b8d-be60-caa89912db9f-kube-api-access-fm69x\") pod \"c9a6840a-2ece-4b8d-be60-caa89912db9f\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.060666 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-operator-scripts\") pod \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.060779 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a6840a-2ece-4b8d-be60-caa89912db9f-operator-scripts\") pod \"c9a6840a-2ece-4b8d-be60-caa89912db9f\" (UID: \"c9a6840a-2ece-4b8d-be60-caa89912db9f\") " Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.060805 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wnt44\" (UniqueName: \"kubernetes.io/projected/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-kube-api-access-wnt44\") pod \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\" (UID: \"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0\") " Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.061104 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e779c2ff-ee70-4779-b3fc-3b3bf87aff47" (UID: "e779c2ff-ee70-4779-b3fc-3b3bf87aff47"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.061357 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ad3cc08-174a-4164-aa38-3d7f6fbed0c0" (UID: "8ad3cc08-174a-4164-aa38-3d7f6fbed0c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.061599 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9a6840a-2ece-4b8d-be60-caa89912db9f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9a6840a-2ece-4b8d-be60-caa89912db9f" (UID: "c9a6840a-2ece-4b8d-be60-caa89912db9f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.064449 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a6840a-2ece-4b8d-be60-caa89912db9f-kube-api-access-fm69x" (OuterVolumeSpecName: "kube-api-access-fm69x") pod "c9a6840a-2ece-4b8d-be60-caa89912db9f" (UID: "c9a6840a-2ece-4b8d-be60-caa89912db9f"). 
InnerVolumeSpecName "kube-api-access-fm69x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.065845 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-kube-api-access-49pjl" (OuterVolumeSpecName: "kube-api-access-49pjl") pod "e779c2ff-ee70-4779-b3fc-3b3bf87aff47" (UID: "e779c2ff-ee70-4779-b3fc-3b3bf87aff47"). InnerVolumeSpecName "kube-api-access-49pjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.068220 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-kube-api-access-wnt44" (OuterVolumeSpecName: "kube-api-access-wnt44") pod "8ad3cc08-174a-4164-aa38-3d7f6fbed0c0" (UID: "8ad3cc08-174a-4164-aa38-3d7f6fbed0c0"). InnerVolumeSpecName "kube-api-access-wnt44". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.163275 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.163546 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49pjl\" (UniqueName: \"kubernetes.io/projected/e779c2ff-ee70-4779-b3fc-3b3bf87aff47-kube-api-access-49pjl\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.163663 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm69x\" (UniqueName: \"kubernetes.io/projected/c9a6840a-2ece-4b8d-be60-caa89912db9f-kube-api-access-fm69x\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.163745 4839 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.163836 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9a6840a-2ece-4b8d-be60-caa89912db9f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.163951 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnt44\" (UniqueName: \"kubernetes.io/projected/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0-kube-api-access-wnt44\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.217817 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-vqfbm"] Mar 21 04:43:20 crc kubenswrapper[4839]: E0321 04:43:20.218461 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e779c2ff-ee70-4779-b3fc-3b3bf87aff47" containerName="mariadb-database-create" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.218528 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e779c2ff-ee70-4779-b3fc-3b3bf87aff47" containerName="mariadb-database-create" Mar 21 04:43:20 crc kubenswrapper[4839]: E0321 04:43:20.218617 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ce13a7-d2a6-4c54-908d-39d1511da50b" containerName="mariadb-account-create-update" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.218687 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ce13a7-d2a6-4c54-908d-39d1511da50b" containerName="mariadb-account-create-update" Mar 21 04:43:20 crc kubenswrapper[4839]: E0321 04:43:20.218753 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad3cc08-174a-4164-aa38-3d7f6fbed0c0" containerName="mariadb-account-create-update" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.218805 4839 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad3cc08-174a-4164-aa38-3d7f6fbed0c0" containerName="mariadb-account-create-update" Mar 21 04:43:20 crc kubenswrapper[4839]: E0321 04:43:20.218863 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a6840a-2ece-4b8d-be60-caa89912db9f" containerName="mariadb-database-create" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.218918 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a6840a-2ece-4b8d-be60-caa89912db9f" containerName="mariadb-database-create" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.219139 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad3cc08-174a-4164-aa38-3d7f6fbed0c0" containerName="mariadb-account-create-update" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.219503 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a6840a-2ece-4b8d-be60-caa89912db9f" containerName="mariadb-database-create" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.219594 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="59ce13a7-d2a6-4c54-908d-39d1511da50b" containerName="mariadb-account-create-update" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.219677 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e779c2ff-ee70-4779-b3fc-3b3bf87aff47" containerName="mariadb-database-create" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.220320 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.228426 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-vqfbm"] Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.265988 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-operator-scripts\") pod \"glance-db-create-vqfbm\" (UID: \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") " pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.266039 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9qph\" (UniqueName: \"kubernetes.io/projected/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-kube-api-access-q9qph\") pod \"glance-db-create-vqfbm\" (UID: \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") " pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.332084 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-31f4-account-create-update-98c9m"] Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.333119 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-31f4-account-create-update-98c9m" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.335281 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.344043 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-31f4-account-create-update-98c9m"] Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.367578 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5740bec9-4b0c-4092-8309-14fdb2562c2e-operator-scripts\") pod \"glance-31f4-account-create-update-98c9m\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") " pod="openstack/glance-31f4-account-create-update-98c9m" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.367940 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpmp5\" (UniqueName: \"kubernetes.io/projected/5740bec9-4b0c-4092-8309-14fdb2562c2e-kube-api-access-qpmp5\") pod \"glance-31f4-account-create-update-98c9m\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") " pod="openstack/glance-31f4-account-create-update-98c9m" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.368140 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-operator-scripts\") pod \"glance-db-create-vqfbm\" (UID: \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") " pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.368250 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qph\" (UniqueName: \"kubernetes.io/projected/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-kube-api-access-q9qph\") pod \"glance-db-create-vqfbm\" (UID: 
\"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") " pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.369034 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-operator-scripts\") pod \"glance-db-create-vqfbm\" (UID: \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") " pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.400048 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9qph\" (UniqueName: \"kubernetes.io/projected/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-kube-api-access-q9qph\") pod \"glance-db-create-vqfbm\" (UID: \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") " pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.403623 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-852d-account-create-update-nv5n7" event={"ID":"8ad3cc08-174a-4164-aa38-3d7f6fbed0c0","Type":"ContainerDied","Data":"fd06d64cd3e2f8af0c1e033552a48d6556973db424a6dde6732cb4af920ae9b3"} Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.403675 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd06d64cd3e2f8af0c1e033552a48d6556973db424a6dde6732cb4af920ae9b3" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.403763 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-852d-account-create-update-nv5n7" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.406431 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1eec-account-create-update-h7hp7" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.406441 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1eec-account-create-update-h7hp7" event={"ID":"59ce13a7-d2a6-4c54-908d-39d1511da50b","Type":"ContainerDied","Data":"b2ca9aec2596c246240aa5a6e603cc06199b5ecd40d56bba26b68d9a5bf760e3"} Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.406480 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2ca9aec2596c246240aa5a6e603cc06199b5ecd40d56bba26b68d9a5bf760e3" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.408005 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-v4k9c" event={"ID":"e779c2ff-ee70-4779-b3fc-3b3bf87aff47","Type":"ContainerDied","Data":"97bb4acf95a9a42df523280643f99bc7040e0d83179344bcc8f457ae232c2302"} Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.408131 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97bb4acf95a9a42df523280643f99bc7040e0d83179344bcc8f457ae232c2302" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.408252 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-v4k9c" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.413130 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-tnx95" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.414659 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tnx95" event={"ID":"c9a6840a-2ece-4b8d-be60-caa89912db9f","Type":"ContainerDied","Data":"978b69facff26547540b897eeb4f977dfaff5f241499331330d6b7477a899fcb"} Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.414729 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="978b69facff26547540b897eeb4f977dfaff5f241499331330d6b7477a899fcb" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.470399 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5740bec9-4b0c-4092-8309-14fdb2562c2e-operator-scripts\") pod \"glance-31f4-account-create-update-98c9m\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") " pod="openstack/glance-31f4-account-create-update-98c9m" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.470522 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpmp5\" (UniqueName: \"kubernetes.io/projected/5740bec9-4b0c-4092-8309-14fdb2562c2e-kube-api-access-qpmp5\") pod \"glance-31f4-account-create-update-98c9m\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") " pod="openstack/glance-31f4-account-create-update-98c9m" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.471131 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5740bec9-4b0c-4092-8309-14fdb2562c2e-operator-scripts\") pod \"glance-31f4-account-create-update-98c9m\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") " pod="openstack/glance-31f4-account-create-update-98c9m" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.488728 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qpmp5\" (UniqueName: \"kubernetes.io/projected/5740bec9-4b0c-4092-8309-14fdb2562c2e-kube-api-access-qpmp5\") pod \"glance-31f4-account-create-update-98c9m\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") " pod="openstack/glance-31f4-account-create-update-98c9m" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.548946 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.649302 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-31f4-account-create-update-98c9m" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.852901 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.858536 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qt5s4" podUID="b31b64cb-0266-4b8a-9fcb-ae5e36c8309a" containerName="ovn-controller" probeResult="failure" output=< Mar 21 04:43:20 crc kubenswrapper[4839]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 21 04:43:20 crc kubenswrapper[4839]: > Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.948104 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hrww8" Mar 21 04:43:20 crc kubenswrapper[4839]: I0321 04:43:20.985246 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-vqfbm"] Mar 21 04:43:21 crc kubenswrapper[4839]: W0321 04:43:21.004860 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb46c59d5_1b87_471e_ae9b_b8ba7ca8d754.slice/crio-91da4146a7f72f81244c5bf66899f5aeb8fe1d05e895e4c0f447f26610ae10a8 WatchSource:0}: Error finding container 
91da4146a7f72f81244c5bf66899f5aeb8fe1d05e895e4c0f447f26610ae10a8: Status 404 returned error can't find the container with id 91da4146a7f72f81244c5bf66899f5aeb8fe1d05e895e4c0f447f26610ae10a8 Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.143877 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-31f4-account-create-update-98c9m"] Mar 21 04:43:21 crc kubenswrapper[4839]: W0321 04:43:21.148713 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5740bec9_4b0c_4092_8309_14fdb2562c2e.slice/crio-90280b9cf99b4278e51fbecf4f72a0d75b08edcce54fb92a99812a0178e1ae70 WatchSource:0}: Error finding container 90280b9cf99b4278e51fbecf4f72a0d75b08edcce54fb92a99812a0178e1ae70: Status 404 returned error can't find the container with id 90280b9cf99b4278e51fbecf4f72a0d75b08edcce54fb92a99812a0178e1ae70 Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.166953 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qt5s4-config-llx9j"] Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.174092 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.179063 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qt5s4-config-llx9j"] Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.180778 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.183852 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.183904 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-scripts\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.183930 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-additional-scripts\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.183992 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bccfm\" (UniqueName: \"kubernetes.io/projected/3fb143a8-0cf7-4e83-8e23-aa49453bac07-kube-api-access-bccfm\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: 
\"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.184010 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run-ovn\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.184033 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-log-ovn\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.285836 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bccfm\" (UniqueName: \"kubernetes.io/projected/3fb143a8-0cf7-4e83-8e23-aa49453bac07-kube-api-access-bccfm\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.285883 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run-ovn\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.285923 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-log-ovn\") pod 
\"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.285999 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.286058 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-scripts\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.286088 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-additional-scripts\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.286333 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run-ovn\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.286360 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-log-ovn\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: 
\"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.286408 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.286922 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-additional-scripts\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.288245 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-scripts\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.307644 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bccfm\" (UniqueName: \"kubernetes.io/projected/3fb143a8-0cf7-4e83-8e23-aa49453bac07-kube-api-access-bccfm\") pod \"ovn-controller-qt5s4-config-llx9j\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.419508 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vqfbm" event={"ID":"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754","Type":"ContainerStarted","Data":"18ee77a1f0c351aba88f15dc3bad4a37015f55b27e92d2f7b43fdfe709bc67ef"} Mar 21 04:43:21 crc 
kubenswrapper[4839]: I0321 04:43:21.419562 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vqfbm" event={"ID":"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754","Type":"ContainerStarted","Data":"91da4146a7f72f81244c5bf66899f5aeb8fe1d05e895e4c0f447f26610ae10a8"} Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.422054 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-31f4-account-create-update-98c9m" event={"ID":"5740bec9-4b0c-4092-8309-14fdb2562c2e","Type":"ContainerStarted","Data":"bc85e819a8b1f2def449cfd0987dc3cf3c1805c923545af4ec58edeba1a10775"} Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.422187 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-31f4-account-create-update-98c9m" event={"ID":"5740bec9-4b0c-4092-8309-14fdb2562c2e","Type":"ContainerStarted","Data":"90280b9cf99b4278e51fbecf4f72a0d75b08edcce54fb92a99812a0178e1ae70"} Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.441709 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-vqfbm" podStartSLOduration=1.441687746 podStartE2EDuration="1.441687746s" podCreationTimestamp="2026-03-21 04:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:21.437125939 +0000 UTC m=+1205.764912625" watchObservedRunningTime="2026-03-21 04:43:21.441687746 +0000 UTC m=+1205.769474422" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.456423 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-31f4-account-create-update-98c9m" podStartSLOduration=1.456406278 podStartE2EDuration="1.456406278s" podCreationTimestamp="2026-03-21 04:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:21.455601115 +0000 UTC 
m=+1205.783387791" watchObservedRunningTime="2026-03-21 04:43:21.456406278 +0000 UTC m=+1205.784192954" Mar 21 04:43:21 crc kubenswrapper[4839]: I0321 04:43:21.556366 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.035720 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qt5s4-config-llx9j"] Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.170704 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-fgh6w"] Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.171703 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fgh6w" Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.173640 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.188801 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fgh6w"] Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.302237 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmcrx\" (UniqueName: \"kubernetes.io/projected/0fc450b4-4ccf-4e6e-97d1-d47f252be788-kube-api-access-hmcrx\") pod \"root-account-create-update-fgh6w\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " pod="openstack/root-account-create-update-fgh6w" Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.302385 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fc450b4-4ccf-4e6e-97d1-d47f252be788-operator-scripts\") pod \"root-account-create-update-fgh6w\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " 
pod="openstack/root-account-create-update-fgh6w" Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.403591 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmcrx\" (UniqueName: \"kubernetes.io/projected/0fc450b4-4ccf-4e6e-97d1-d47f252be788-kube-api-access-hmcrx\") pod \"root-account-create-update-fgh6w\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " pod="openstack/root-account-create-update-fgh6w" Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.403667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fc450b4-4ccf-4e6e-97d1-d47f252be788-operator-scripts\") pod \"root-account-create-update-fgh6w\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " pod="openstack/root-account-create-update-fgh6w" Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.404525 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fc450b4-4ccf-4e6e-97d1-d47f252be788-operator-scripts\") pod \"root-account-create-update-fgh6w\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " pod="openstack/root-account-create-update-fgh6w" Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.423689 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmcrx\" (UniqueName: \"kubernetes.io/projected/0fc450b4-4ccf-4e6e-97d1-d47f252be788-kube-api-access-hmcrx\") pod \"root-account-create-update-fgh6w\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " pod="openstack/root-account-create-update-fgh6w" Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.444117 4839 generic.go:334] "Generic (PLEG): container finished" podID="b46c59d5-1b87-471e-ae9b-b8ba7ca8d754" containerID="18ee77a1f0c351aba88f15dc3bad4a37015f55b27e92d2f7b43fdfe709bc67ef" exitCode=0 Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.444185 4839 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vqfbm" event={"ID":"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754","Type":"ContainerDied","Data":"18ee77a1f0c351aba88f15dc3bad4a37015f55b27e92d2f7b43fdfe709bc67ef"} Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.446770 4839 generic.go:334] "Generic (PLEG): container finished" podID="5484abbf-53f2-445a-b6fe-0996eba95345" containerID="93b552b909df831d40a3b2b56c1f6ba5babeea45e19a365649677dff6a5a3b56" exitCode=0 Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.446825 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kkvzq" event={"ID":"5484abbf-53f2-445a-b6fe-0996eba95345","Type":"ContainerDied","Data":"93b552b909df831d40a3b2b56c1f6ba5babeea45e19a365649677dff6a5a3b56"} Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.448727 4839 generic.go:334] "Generic (PLEG): container finished" podID="5740bec9-4b0c-4092-8309-14fdb2562c2e" containerID="bc85e819a8b1f2def449cfd0987dc3cf3c1805c923545af4ec58edeba1a10775" exitCode=0 Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.448874 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-31f4-account-create-update-98c9m" event={"ID":"5740bec9-4b0c-4092-8309-14fdb2562c2e","Type":"ContainerDied","Data":"bc85e819a8b1f2def449cfd0987dc3cf3c1805c923545af4ec58edeba1a10775"} Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.472103 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4-config-llx9j" event={"ID":"3fb143a8-0cf7-4e83-8e23-aa49453bac07","Type":"ContainerStarted","Data":"d792580397713b6021c551a3f4cbfaf97f1c5637484d37b25e33338bf6fc4ac7"} Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.472161 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4-config-llx9j" 
event={"ID":"3fb143a8-0cf7-4e83-8e23-aa49453bac07","Type":"ContainerStarted","Data":"c4744e1d4932aeabfd76efbfcdb4f01e6c1633f7bb3d9b33a5ac60558694c4bb"} Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.495332 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fgh6w" Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.514601 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-qt5s4-config-llx9j" podStartSLOduration=1.5145837100000001 podStartE2EDuration="1.51458371s" podCreationTimestamp="2026-03-21 04:43:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:22.507611485 +0000 UTC m=+1206.835398181" watchObservedRunningTime="2026-03-21 04:43:22.51458371 +0000 UTC m=+1206.842370386" Mar 21 04:43:22 crc kubenswrapper[4839]: I0321 04:43:22.982837 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fgh6w"] Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.476303 4839 generic.go:334] "Generic (PLEG): container finished" podID="0fc450b4-4ccf-4e6e-97d1-d47f252be788" containerID="6b5e6693316b5cfa06c0f6c8e7e9f37a0398c873966489b56f00cbd44f60fd16" exitCode=0 Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.477052 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fgh6w" event={"ID":"0fc450b4-4ccf-4e6e-97d1-d47f252be788","Type":"ContainerDied","Data":"6b5e6693316b5cfa06c0f6c8e7e9f37a0398c873966489b56f00cbd44f60fd16"} Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.477240 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fgh6w" event={"ID":"0fc450b4-4ccf-4e6e-97d1-d47f252be788","Type":"ContainerStarted","Data":"2bb3f1fdeb6e88bb5e653c065ef6cd3c419af258dfd00a1bd1da4d47d7d7f880"} Mar 21 04:43:23 crc 
kubenswrapper[4839]: I0321 04:43:23.484695 4839 generic.go:334] "Generic (PLEG): container finished" podID="3fb143a8-0cf7-4e83-8e23-aa49453bac07" containerID="d792580397713b6021c551a3f4cbfaf97f1c5637484d37b25e33338bf6fc4ac7" exitCode=0 Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.484733 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4-config-llx9j" event={"ID":"3fb143a8-0cf7-4e83-8e23-aa49453bac07","Type":"ContainerDied","Data":"d792580397713b6021c551a3f4cbfaf97f1c5637484d37b25e33338bf6fc4ac7"} Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.916985 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.964538 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-dispersionconf\") pod \"5484abbf-53f2-445a-b6fe-0996eba95345\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.964894 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-scripts\") pod \"5484abbf-53f2-445a-b6fe-0996eba95345\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.965084 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-combined-ca-bundle\") pod \"5484abbf-53f2-445a-b6fe-0996eba95345\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.965783 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4552p\" (UniqueName: 
\"kubernetes.io/projected/5484abbf-53f2-445a-b6fe-0996eba95345-kube-api-access-4552p\") pod \"5484abbf-53f2-445a-b6fe-0996eba95345\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.965954 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-ring-data-devices\") pod \"5484abbf-53f2-445a-b6fe-0996eba95345\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.966085 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-swiftconf\") pod \"5484abbf-53f2-445a-b6fe-0996eba95345\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.966219 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5484abbf-53f2-445a-b6fe-0996eba95345-etc-swift\") pod \"5484abbf-53f2-445a-b6fe-0996eba95345\" (UID: \"5484abbf-53f2-445a-b6fe-0996eba95345\") " Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.968900 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5484abbf-53f2-445a-b6fe-0996eba95345-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5484abbf-53f2-445a-b6fe-0996eba95345" (UID: "5484abbf-53f2-445a-b6fe-0996eba95345"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.969518 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5484abbf-53f2-445a-b6fe-0996eba95345" (UID: "5484abbf-53f2-445a-b6fe-0996eba95345"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.970661 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5484abbf-53f2-445a-b6fe-0996eba95345-kube-api-access-4552p" (OuterVolumeSpecName: "kube-api-access-4552p") pod "5484abbf-53f2-445a-b6fe-0996eba95345" (UID: "5484abbf-53f2-445a-b6fe-0996eba95345"). InnerVolumeSpecName "kube-api-access-4552p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.978803 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5484abbf-53f2-445a-b6fe-0996eba95345" (UID: "5484abbf-53f2-445a-b6fe-0996eba95345"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:43:23 crc kubenswrapper[4839]: I0321 04:43:23.999340 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-scripts" (OuterVolumeSpecName: "scripts") pod "5484abbf-53f2-445a-b6fe-0996eba95345" (UID: "5484abbf-53f2-445a-b6fe-0996eba95345"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.008937 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5484abbf-53f2-445a-b6fe-0996eba95345" (UID: "5484abbf-53f2-445a-b6fe-0996eba95345"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.015440 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5484abbf-53f2-445a-b6fe-0996eba95345" (UID: "5484abbf-53f2-445a-b6fe-0996eba95345"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.069086 4839 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.069139 4839 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5484abbf-53f2-445a-b6fe-0996eba95345-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.069161 4839 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.069177 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:24 crc kubenswrapper[4839]: 
I0321 04:43:24.069191 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5484abbf-53f2-445a-b6fe-0996eba95345-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.069204 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4552p\" (UniqueName: \"kubernetes.io/projected/5484abbf-53f2-445a-b6fe-0996eba95345-kube-api-access-4552p\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.069216 4839 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5484abbf-53f2-445a-b6fe-0996eba95345-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.105083 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-31f4-account-create-update-98c9m" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.117596 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.169881 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9qph\" (UniqueName: \"kubernetes.io/projected/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-kube-api-access-q9qph\") pod \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\" (UID: \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") " Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.170015 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpmp5\" (UniqueName: \"kubernetes.io/projected/5740bec9-4b0c-4092-8309-14fdb2562c2e-kube-api-access-qpmp5\") pod \"5740bec9-4b0c-4092-8309-14fdb2562c2e\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") " Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.170156 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5740bec9-4b0c-4092-8309-14fdb2562c2e-operator-scripts\") pod \"5740bec9-4b0c-4092-8309-14fdb2562c2e\" (UID: \"5740bec9-4b0c-4092-8309-14fdb2562c2e\") " Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.170207 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-operator-scripts\") pod \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\" (UID: \"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754\") " Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.171210 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5740bec9-4b0c-4092-8309-14fdb2562c2e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5740bec9-4b0c-4092-8309-14fdb2562c2e" (UID: "5740bec9-4b0c-4092-8309-14fdb2562c2e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.171376 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b46c59d5-1b87-471e-ae9b-b8ba7ca8d754" (UID: "b46c59d5-1b87-471e-ae9b-b8ba7ca8d754"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.173869 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-kube-api-access-q9qph" (OuterVolumeSpecName: "kube-api-access-q9qph") pod "b46c59d5-1b87-471e-ae9b-b8ba7ca8d754" (UID: "b46c59d5-1b87-471e-ae9b-b8ba7ca8d754"). InnerVolumeSpecName "kube-api-access-q9qph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.177074 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5740bec9-4b0c-4092-8309-14fdb2562c2e-kube-api-access-qpmp5" (OuterVolumeSpecName: "kube-api-access-qpmp5") pod "5740bec9-4b0c-4092-8309-14fdb2562c2e" (UID: "5740bec9-4b0c-4092-8309-14fdb2562c2e"). InnerVolumeSpecName "kube-api-access-qpmp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.272297 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpmp5\" (UniqueName: \"kubernetes.io/projected/5740bec9-4b0c-4092-8309-14fdb2562c2e-kube-api-access-qpmp5\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.272604 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5740bec9-4b0c-4092-8309-14fdb2562c2e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.272622 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.272635 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9qph\" (UniqueName: \"kubernetes.io/projected/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754-kube-api-access-q9qph\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.495260 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-kkvzq" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.495241 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-kkvzq" event={"ID":"5484abbf-53f2-445a-b6fe-0996eba95345","Type":"ContainerDied","Data":"609cfec14cc280f7aa99193851e53d3aac7200e8836be9f73b80cc6653f3fdc5"} Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.497967 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="609cfec14cc280f7aa99193851e53d3aac7200e8836be9f73b80cc6653f3fdc5" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.504474 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-31f4-account-create-update-98c9m" event={"ID":"5740bec9-4b0c-4092-8309-14fdb2562c2e","Type":"ContainerDied","Data":"90280b9cf99b4278e51fbecf4f72a0d75b08edcce54fb92a99812a0178e1ae70"} Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.504765 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90280b9cf99b4278e51fbecf4f72a0d75b08edcce54fb92a99812a0178e1ae70" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.504793 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-31f4-account-create-update-98c9m" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.506894 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vqfbm" event={"ID":"b46c59d5-1b87-471e-ae9b-b8ba7ca8d754","Type":"ContainerDied","Data":"91da4146a7f72f81244c5bf66899f5aeb8fe1d05e895e4c0f447f26610ae10a8"} Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.506926 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91da4146a7f72f81244c5bf66899f5aeb8fe1d05e895e4c0f447f26610ae10a8" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.507211 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-vqfbm" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.978235 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fgh6w" Mar 21 04:43:24 crc kubenswrapper[4839]: I0321 04:43:24.984662 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125133 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-log-ovn\") pod \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125215 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3fb143a8-0cf7-4e83-8e23-aa49453bac07" (UID: "3fb143a8-0cf7-4e83-8e23-aa49453bac07"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125228 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fc450b4-4ccf-4e6e-97d1-d47f252be788-operator-scripts\") pod \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125264 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-scripts\") pod \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125291 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run\") pod \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125324 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-additional-scripts\") pod \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125398 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run-ovn\") pod \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125426 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmcrx\" (UniqueName: 
\"kubernetes.io/projected/0fc450b4-4ccf-4e6e-97d1-d47f252be788-kube-api-access-hmcrx\") pod \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\" (UID: \"0fc450b4-4ccf-4e6e-97d1-d47f252be788\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125552 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bccfm\" (UniqueName: \"kubernetes.io/projected/3fb143a8-0cf7-4e83-8e23-aa49453bac07-kube-api-access-bccfm\") pod \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\" (UID: \"3fb143a8-0cf7-4e83-8e23-aa49453bac07\") " Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125743 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fc450b4-4ccf-4e6e-97d1-d47f252be788-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0fc450b4-4ccf-4e6e-97d1-d47f252be788" (UID: "0fc450b4-4ccf-4e6e-97d1-d47f252be788"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125882 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3fb143a8-0cf7-4e83-8e23-aa49453bac07" (UID: "3fb143a8-0cf7-4e83-8e23-aa49453bac07"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.125951 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run" (OuterVolumeSpecName: "var-run") pod "3fb143a8-0cf7-4e83-8e23-aa49453bac07" (UID: "3fb143a8-0cf7-4e83-8e23-aa49453bac07"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.126313 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fc450b4-4ccf-4e6e-97d1-d47f252be788-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.126329 4839 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.126338 4839 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.126346 4839 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3fb143a8-0cf7-4e83-8e23-aa49453bac07-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.126393 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3fb143a8-0cf7-4e83-8e23-aa49453bac07" (UID: "3fb143a8-0cf7-4e83-8e23-aa49453bac07"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.126742 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-scripts" (OuterVolumeSpecName: "scripts") pod "3fb143a8-0cf7-4e83-8e23-aa49453bac07" (UID: "3fb143a8-0cf7-4e83-8e23-aa49453bac07"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.130547 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fc450b4-4ccf-4e6e-97d1-d47f252be788-kube-api-access-hmcrx" (OuterVolumeSpecName: "kube-api-access-hmcrx") pod "0fc450b4-4ccf-4e6e-97d1-d47f252be788" (UID: "0fc450b4-4ccf-4e6e-97d1-d47f252be788"). InnerVolumeSpecName "kube-api-access-hmcrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.147932 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fb143a8-0cf7-4e83-8e23-aa49453bac07-kube-api-access-bccfm" (OuterVolumeSpecName: "kube-api-access-bccfm") pod "3fb143a8-0cf7-4e83-8e23-aa49453bac07" (UID: "3fb143a8-0cf7-4e83-8e23-aa49453bac07"). InnerVolumeSpecName "kube-api-access-bccfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.228129 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bccfm\" (UniqueName: \"kubernetes.io/projected/3fb143a8-0cf7-4e83-8e23-aa49453bac07-kube-api-access-bccfm\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.228165 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.228174 4839 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3fb143a8-0cf7-4e83-8e23-aa49453bac07-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.228185 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmcrx\" (UniqueName: 
\"kubernetes.io/projected/0fc450b4-4ccf-4e6e-97d1-d47f252be788-kube-api-access-hmcrx\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.515562 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fgh6w" event={"ID":"0fc450b4-4ccf-4e6e-97d1-d47f252be788","Type":"ContainerDied","Data":"2bb3f1fdeb6e88bb5e653c065ef6cd3c419af258dfd00a1bd1da4d47d7d7f880"} Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.515604 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fgh6w" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.515617 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bb3f1fdeb6e88bb5e653c065ef6cd3c419af258dfd00a1bd1da4d47d7d7f880" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.517326 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4-config-llx9j" event={"ID":"3fb143a8-0cf7-4e83-8e23-aa49453bac07","Type":"ContainerDied","Data":"c4744e1d4932aeabfd76efbfcdb4f01e6c1633f7bb3d9b33a5ac60558694c4bb"} Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.517372 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-llx9j" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.517383 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4744e1d4932aeabfd76efbfcdb4f01e6c1633f7bb3d9b33a5ac60558694c4bb" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.577387 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-ng2tw"] Mar 21 04:43:25 crc kubenswrapper[4839]: E0321 04:43:25.577794 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5740bec9-4b0c-4092-8309-14fdb2562c2e" containerName="mariadb-account-create-update" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.577815 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5740bec9-4b0c-4092-8309-14fdb2562c2e" containerName="mariadb-account-create-update" Mar 21 04:43:25 crc kubenswrapper[4839]: E0321 04:43:25.577832 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5484abbf-53f2-445a-b6fe-0996eba95345" containerName="swift-ring-rebalance" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.577841 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5484abbf-53f2-445a-b6fe-0996eba95345" containerName="swift-ring-rebalance" Mar 21 04:43:25 crc kubenswrapper[4839]: E0321 04:43:25.577849 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fb143a8-0cf7-4e83-8e23-aa49453bac07" containerName="ovn-config" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.577857 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fb143a8-0cf7-4e83-8e23-aa49453bac07" containerName="ovn-config" Mar 21 04:43:25 crc kubenswrapper[4839]: E0321 04:43:25.577872 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46c59d5-1b87-471e-ae9b-b8ba7ca8d754" containerName="mariadb-database-create" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.577880 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b46c59d5-1b87-471e-ae9b-b8ba7ca8d754" containerName="mariadb-database-create" Mar 21 04:43:25 crc kubenswrapper[4839]: E0321 04:43:25.577896 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc450b4-4ccf-4e6e-97d1-d47f252be788" containerName="mariadb-account-create-update" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.577905 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc450b4-4ccf-4e6e-97d1-d47f252be788" containerName="mariadb-account-create-update" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.578100 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b46c59d5-1b87-471e-ae9b-b8ba7ca8d754" containerName="mariadb-database-create" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.578115 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5484abbf-53f2-445a-b6fe-0996eba95345" containerName="swift-ring-rebalance" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.578129 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5740bec9-4b0c-4092-8309-14fdb2562c2e" containerName="mariadb-account-create-update" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.578141 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fb143a8-0cf7-4e83-8e23-aa49453bac07" containerName="ovn-config" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.578151 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fc450b4-4ccf-4e6e-97d1-d47f252be788" containerName="mariadb-account-create-update" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.578810 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.580840 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v5bc4" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.581331 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.595379 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ng2tw"] Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.634667 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8phmg\" (UniqueName: \"kubernetes.io/projected/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-kube-api-access-8phmg\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.634715 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-config-data\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.634790 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-db-sync-config-data\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.634839 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-combined-ca-bundle\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.641906 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-qt5s4-config-llx9j"] Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.648392 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-qt5s4-config-llx9j"] Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.736689 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-db-sync-config-data\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.736787 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-combined-ca-bundle\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.736871 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8phmg\" (UniqueName: \"kubernetes.io/projected/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-kube-api-access-8phmg\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.736899 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-config-data\") pod \"glance-db-sync-ng2tw\" (UID: 
\"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.741763 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-db-sync-config-data\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.742002 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-combined-ca-bundle\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.743056 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-config-data\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.751892 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qt5s4-config-zb6w6"] Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.754190 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.763200 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qt5s4-config-zb6w6"] Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.763526 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.767832 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8phmg\" (UniqueName: \"kubernetes.io/projected/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-kube-api-access-8phmg\") pod \"glance-db-sync-ng2tw\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") " pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.826933 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-qt5s4" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.838832 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.838894 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxxxk\" (UniqueName: \"kubernetes.io/projected/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-kube-api-access-wxxxk\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.838954 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-additional-scripts\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.839011 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-log-ovn\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.839028 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-scripts\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.839050 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run-ovn\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.906066 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ng2tw" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.940604 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.940669 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxxxk\" (UniqueName: \"kubernetes.io/projected/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-kube-api-access-wxxxk\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.940764 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-additional-scripts\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.940875 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-log-ovn\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.940899 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-scripts\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " 
pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.940929 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run-ovn\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.941223 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-log-ovn\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.941476 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.941974 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-additional-scripts\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.943658 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-scripts\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 
crc kubenswrapper[4839]: I0321 04:43:25.944370 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run-ovn\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:25 crc kubenswrapper[4839]: I0321 04:43:25.963118 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxxxk\" (UniqueName: \"kubernetes.io/projected/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-kube-api-access-wxxxk\") pod \"ovn-controller-qt5s4-config-zb6w6\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:26 crc kubenswrapper[4839]: I0321 04:43:26.085574 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:26 crc kubenswrapper[4839]: I0321 04:43:26.462661 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fb143a8-0cf7-4e83-8e23-aa49453bac07" path="/var/lib/kubelet/pods/3fb143a8-0cf7-4e83-8e23-aa49453bac07/volumes" Mar 21 04:43:26 crc kubenswrapper[4839]: I0321 04:43:26.491229 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ng2tw"] Mar 21 04:43:26 crc kubenswrapper[4839]: W0321 04:43:26.496992 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cc1dfb9_8108_46e5_8dc5_5b555590ecc1.slice/crio-65ed5290939376cc07f297d06efe7a7f9acbf33da55d639c2cc318b6e8be4b9e WatchSource:0}: Error finding container 65ed5290939376cc07f297d06efe7a7f9acbf33da55d639c2cc318b6e8be4b9e: Status 404 returned error can't find the container with id 65ed5290939376cc07f297d06efe7a7f9acbf33da55d639c2cc318b6e8be4b9e Mar 21 04:43:26 crc kubenswrapper[4839]: I0321 04:43:26.525869 4839 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ng2tw" event={"ID":"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1","Type":"ContainerStarted","Data":"65ed5290939376cc07f297d06efe7a7f9acbf33da55d639c2cc318b6e8be4b9e"} Mar 21 04:43:26 crc kubenswrapper[4839]: I0321 04:43:26.554677 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qt5s4-config-zb6w6"] Mar 21 04:43:26 crc kubenswrapper[4839]: W0321 04:43:26.556343 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a1c7cc4_a44f_4c22_ac1c_9ef543768cf7.slice/crio-c979bc2884e80382e07c8880ad5376980127eb35c7a6271458a4d919ac49e89d WatchSource:0}: Error finding container c979bc2884e80382e07c8880ad5376980127eb35c7a6271458a4d919ac49e89d: Status 404 returned error can't find the container with id c979bc2884e80382e07c8880ad5376980127eb35c7a6271458a4d919ac49e89d Mar 21 04:43:27 crc kubenswrapper[4839]: I0321 04:43:27.537260 4839 generic.go:334] "Generic (PLEG): container finished" podID="9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" containerID="4f187e9e33fa923f2b2629c019ef104918ae6112912ac0480384a8c6a651a762" exitCode=0 Mar 21 04:43:27 crc kubenswrapper[4839]: I0321 04:43:27.537541 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4-config-zb6w6" event={"ID":"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7","Type":"ContainerDied","Data":"4f187e9e33fa923f2b2629c019ef104918ae6112912ac0480384a8c6a651a762"} Mar 21 04:43:27 crc kubenswrapper[4839]: I0321 04:43:27.537618 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4-config-zb6w6" event={"ID":"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7","Type":"ContainerStarted","Data":"c979bc2884e80382e07c8880ad5376980127eb35c7a6271458a4d919ac49e89d"} Mar 21 04:43:28 crc kubenswrapper[4839]: I0321 04:43:28.473070 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-fgh6w"] Mar 21 
04:43:28 crc kubenswrapper[4839]: I0321 04:43:28.489944 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-fgh6w"] Mar 21 04:43:28 crc kubenswrapper[4839]: I0321 04:43:28.922180 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.015855 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-log-ovn\") pod \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.015953 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-additional-scripts\") pod \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.015955 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" (UID: "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.016032 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxxxk\" (UniqueName: \"kubernetes.io/projected/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-kube-api-access-wxxxk\") pod \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.016048 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run\") pod \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.016091 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-scripts\") pod \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.016155 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run-ovn\") pod \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\" (UID: \"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7\") " Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.016516 4839 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.016583 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod 
"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" (UID: "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.016588 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run" (OuterVolumeSpecName: "var-run") pod "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" (UID: "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.017068 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" (UID: "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.017726 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-scripts" (OuterVolumeSpecName: "scripts") pod "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" (UID: "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.033347 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-kube-api-access-wxxxk" (OuterVolumeSpecName: "kube-api-access-wxxxk") pod "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" (UID: "9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7"). InnerVolumeSpecName "kube-api-access-wxxxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.117589 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.117619 4839 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.117629 4839 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.117639 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxxxk\" (UniqueName: \"kubernetes.io/projected/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-kube-api-access-wxxxk\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.117648 4839 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7-var-run\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.556389 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qt5s4-config-zb6w6" event={"ID":"9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7","Type":"ContainerDied","Data":"c979bc2884e80382e07c8880ad5376980127eb35c7a6271458a4d919ac49e89d"} Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.556428 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c979bc2884e80382e07c8880ad5376980127eb35c7a6271458a4d919ac49e89d" Mar 21 04:43:29 crc kubenswrapper[4839]: I0321 04:43:29.556479 4839 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qt5s4-config-zb6w6" Mar 21 04:43:30 crc kubenswrapper[4839]: I0321 04:43:30.004717 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-qt5s4-config-zb6w6"] Mar 21 04:43:30 crc kubenswrapper[4839]: I0321 04:43:30.011836 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-qt5s4-config-zb6w6"] Mar 21 04:43:30 crc kubenswrapper[4839]: I0321 04:43:30.462030 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fc450b4-4ccf-4e6e-97d1-d47f252be788" path="/var/lib/kubelet/pods/0fc450b4-4ccf-4e6e-97d1-d47f252be788/volumes" Mar 21 04:43:30 crc kubenswrapper[4839]: I0321 04:43:30.463254 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" path="/var/lib/kubelet/pods/9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7/volumes" Mar 21 04:43:31 crc kubenswrapper[4839]: I0321 04:43:31.456210 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:43:31 crc kubenswrapper[4839]: I0321 04:43:31.462828 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9848d2f0-c562-4b2a-bd1c-cd91c6754079-etc-swift\") pod \"swift-storage-0\" (UID: \"9848d2f0-c562-4b2a-bd1c-cd91c6754079\") " pod="openstack/swift-storage-0" Mar 21 04:43:31 crc kubenswrapper[4839]: I0321 04:43:31.518996 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 21 04:43:31 crc kubenswrapper[4839]: I0321 04:43:31.852764 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:43:32 crc kubenswrapper[4839]: I0321 04:43:32.200771 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.496653 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-xc9zf"] Mar 21 04:43:33 crc kubenswrapper[4839]: E0321 04:43:33.497385 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" containerName="ovn-config" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.497400 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" containerName="ovn-config" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.497610 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a1c7cc4-a44f-4c22-ac1c-9ef543768cf7" containerName="ovn-config" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.498177 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.501992 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.517615 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xc9zf"] Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.600895 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-operator-scripts\") pod \"root-account-create-update-xc9zf\" (UID: \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.601041 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtcj5\" (UniqueName: \"kubernetes.io/projected/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-kube-api-access-mtcj5\") pod \"root-account-create-update-xc9zf\" (UID: \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.702431 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtcj5\" (UniqueName: \"kubernetes.io/projected/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-kube-api-access-mtcj5\") pod \"root-account-create-update-xc9zf\" (UID: \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.702512 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-operator-scripts\") pod \"root-account-create-update-xc9zf\" (UID: 
\"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.703243 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-operator-scripts\") pod \"root-account-create-update-xc9zf\" (UID: \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.725631 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtcj5\" (UniqueName: \"kubernetes.io/projected/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-kube-api-access-mtcj5\") pod \"root-account-create-update-xc9zf\" (UID: \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.814443 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-nv7qf"] Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.815882 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.823041 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nv7qf"] Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.826761 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.909199 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3d03-account-create-update-q9rgd"] Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.910484 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.916034 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-operator-scripts\") pod \"cinder-db-create-nv7qf\" (UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.916121 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cgqq\" (UniqueName: \"kubernetes.io/projected/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-kube-api-access-7cgqq\") pod \"cinder-db-create-nv7qf\" (UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.920444 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3d03-account-create-update-q9rgd"] Mar 21 04:43:33 crc kubenswrapper[4839]: I0321 04:43:33.928445 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.011680 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-h5448"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.013265 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.019255 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cgqq\" (UniqueName: \"kubernetes.io/projected/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-kube-api-access-7cgqq\") pod \"cinder-db-create-nv7qf\" (UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.019317 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4t9p\" (UniqueName: \"kubernetes.io/projected/a7dfdbcf-7830-4f8d-a165-119fe80d999a-kube-api-access-b4t9p\") pod \"cinder-3d03-account-create-update-q9rgd\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.019460 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7dfdbcf-7830-4f8d-a165-119fe80d999a-operator-scripts\") pod \"cinder-3d03-account-create-update-q9rgd\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.019485 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-operator-scripts\") pod \"cinder-db-create-nv7qf\" (UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.020757 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-operator-scripts\") pod \"cinder-db-create-nv7qf\" 
(UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.036107 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h5448"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.064759 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cgqq\" (UniqueName: \"kubernetes.io/projected/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-kube-api-access-7cgqq\") pod \"cinder-db-create-nv7qf\" (UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.105150 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-79rjr"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.122313 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4t9p\" (UniqueName: \"kubernetes.io/projected/a7dfdbcf-7830-4f8d-a165-119fe80d999a-kube-api-access-b4t9p\") pod \"cinder-3d03-account-create-update-q9rgd\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.122381 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5brcn\" (UniqueName: \"kubernetes.io/projected/34a240db-9587-446e-af12-a44b87b1a3ac-kube-api-access-5brcn\") pod \"barbican-db-create-h5448\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.122443 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a240db-9587-446e-af12-a44b87b1a3ac-operator-scripts\") pod \"barbican-db-create-h5448\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " 
pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.122479 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7dfdbcf-7830-4f8d-a165-119fe80d999a-operator-scripts\") pod \"cinder-3d03-account-create-update-q9rgd\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.123173 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7dfdbcf-7830-4f8d-a165-119fe80d999a-operator-scripts\") pod \"cinder-3d03-account-create-update-q9rgd\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.128098 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.129302 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-79rjr"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.136617 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.141816 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4d6f-account-create-update-st2sv"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.144481 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.151083 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4d6f-account-create-update-st2sv"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.152958 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.163199 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4t9p\" (UniqueName: \"kubernetes.io/projected/a7dfdbcf-7830-4f8d-a165-119fe80d999a-kube-api-access-b4t9p\") pod \"cinder-3d03-account-create-update-q9rgd\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.217584 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-qgdlf"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.218808 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.221773 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pzsvm" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.224052 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a240db-9587-446e-af12-a44b87b1a3ac-operator-scripts\") pod \"barbican-db-create-h5448\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.224105 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c8ad856-1b19-4b1c-8124-2e316dd567ee-operator-scripts\") pod \"neutron-4d6f-account-create-update-st2sv\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.225262 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.226040 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a240db-9587-446e-af12-a44b87b1a3ac-operator-scripts\") pod \"barbican-db-create-h5448\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.228139 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gmsm\" (UniqueName: \"kubernetes.io/projected/8c8ad856-1b19-4b1c-8124-2e316dd567ee-kube-api-access-9gmsm\") pod \"neutron-4d6f-account-create-update-st2sv\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " 
pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.228790 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfp6q\" (UniqueName: \"kubernetes.io/projected/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-kube-api-access-sfp6q\") pod \"neutron-db-create-79rjr\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.228879 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5brcn\" (UniqueName: \"kubernetes.io/projected/34a240db-9587-446e-af12-a44b87b1a3ac-kube-api-access-5brcn\") pod \"barbican-db-create-h5448\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.228971 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-operator-scripts\") pod \"neutron-db-create-79rjr\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.229402 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.231886 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.237081 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.265763 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qgdlf"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.267323 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5brcn\" (UniqueName: \"kubernetes.io/projected/34a240db-9587-446e-af12-a44b87b1a3ac-kube-api-access-5brcn\") pod \"barbican-db-create-h5448\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.273861 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a5f8-account-create-update-2srvb"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.275371 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.278454 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.293705 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a5f8-account-create-update-2srvb"] Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.330104 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfp6q\" (UniqueName: \"kubernetes.io/projected/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-kube-api-access-sfp6q\") pod \"neutron-db-create-79rjr\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.330156 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-config-data\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.330203 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-combined-ca-bundle\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.330231 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-operator-scripts\") pod \"neutron-db-create-79rjr\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " pod="openstack/neutron-db-create-79rjr" Mar 21 
04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.330687 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcwg2\" (UniqueName: \"kubernetes.io/projected/bc21c34c-13c1-4733-9013-0cfd304b179c-kube-api-access-vcwg2\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.330726 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c8ad856-1b19-4b1c-8124-2e316dd567ee-operator-scripts\") pod \"neutron-4d6f-account-create-update-st2sv\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.330751 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gmsm\" (UniqueName: \"kubernetes.io/projected/8c8ad856-1b19-4b1c-8124-2e316dd567ee-kube-api-access-9gmsm\") pod \"neutron-4d6f-account-create-update-st2sv\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.331005 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-operator-scripts\") pod \"neutron-db-create-79rjr\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.331688 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c8ad856-1b19-4b1c-8124-2e316dd567ee-operator-scripts\") pod \"neutron-4d6f-account-create-update-st2sv\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " 
pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.339226 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h5448" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.346759 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gmsm\" (UniqueName: \"kubernetes.io/projected/8c8ad856-1b19-4b1c-8124-2e316dd567ee-kube-api-access-9gmsm\") pod \"neutron-4d6f-account-create-update-st2sv\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.363252 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfp6q\" (UniqueName: \"kubernetes.io/projected/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-kube-api-access-sfp6q\") pod \"neutron-db-create-79rjr\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.432643 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-config-data\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.432730 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-combined-ca-bundle\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.432773 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcwg2\" (UniqueName: 
\"kubernetes.io/projected/bc21c34c-13c1-4733-9013-0cfd304b179c-kube-api-access-vcwg2\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.432840 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fm77\" (UniqueName: \"kubernetes.io/projected/f816daf8-a9c7-4e99-a622-2f9bee7d203a-kube-api-access-6fm77\") pod \"barbican-a5f8-account-create-update-2srvb\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.432875 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f816daf8-a9c7-4e99-a622-2f9bee7d203a-operator-scripts\") pod \"barbican-a5f8-account-create-update-2srvb\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.437295 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-combined-ca-bundle\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.438831 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-config-data\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.447108 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.449526 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcwg2\" (UniqueName: \"kubernetes.io/projected/bc21c34c-13c1-4733-9013-0cfd304b179c-kube-api-access-vcwg2\") pod \"keystone-db-sync-qgdlf\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.464001 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.533948 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fm77\" (UniqueName: \"kubernetes.io/projected/f816daf8-a9c7-4e99-a622-2f9bee7d203a-kube-api-access-6fm77\") pod \"barbican-a5f8-account-create-update-2srvb\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.534274 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f816daf8-a9c7-4e99-a622-2f9bee7d203a-operator-scripts\") pod \"barbican-a5f8-account-create-update-2srvb\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.534936 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f816daf8-a9c7-4e99-a622-2f9bee7d203a-operator-scripts\") pod \"barbican-a5f8-account-create-update-2srvb\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.548817 4839 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.564000 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fm77\" (UniqueName: \"kubernetes.io/projected/f816daf8-a9c7-4e99-a622-2f9bee7d203a-kube-api-access-6fm77\") pod \"barbican-a5f8-account-create-update-2srvb\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:34 crc kubenswrapper[4839]: I0321 04:43:34.615252 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:39 crc kubenswrapper[4839]: I0321 04:43:39.942856 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3d03-account-create-update-q9rgd"] Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.172476 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4d6f-account-create-update-st2sv"] Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.183124 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-xc9zf"] Mar 21 04:43:40 crc kubenswrapper[4839]: W0321 04:43:40.187368 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb567c69c_d110_4ab2_aaf7_da82f0e72cc3.slice/crio-e3011e29981c589ee376b9ae98f36e4cbafbbd3d1eaa73e7fc08679d2e065608 WatchSource:0}: Error finding container e3011e29981c589ee376b9ae98f36e4cbafbbd3d1eaa73e7fc08679d2e065608: Status 404 returned error can't find the container with id e3011e29981c589ee376b9ae98f36e4cbafbbd3d1eaa73e7fc08679d2e065608 Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.289210 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a5f8-account-create-update-2srvb"] Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.324711 
4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qgdlf"] Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.338417 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nv7qf"] Mar 21 04:43:40 crc kubenswrapper[4839]: W0321 04:43:40.361706 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf816daf8_a9c7_4e99_a622_2f9bee7d203a.slice/crio-94ac03e112c80ef0d5418ea52888bc6a7eac3f10db846d5d3635730b5e38afad WatchSource:0}: Error finding container 94ac03e112c80ef0d5418ea52888bc6a7eac3f10db846d5d3635730b5e38afad: Status 404 returned error can't find the container with id 94ac03e112c80ef0d5418ea52888bc6a7eac3f10db846d5d3635730b5e38afad Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.495847 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 21 04:43:40 crc kubenswrapper[4839]: W0321 04:43:40.497684 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9848d2f0_c562_4b2a_bd1c_cd91c6754079.slice/crio-e698f98cfc8ae5d6f5ea4aa44381ba9e5573fe7604841a4a4c09e46cce5d1ce7 WatchSource:0}: Error finding container e698f98cfc8ae5d6f5ea4aa44381ba9e5573fe7604841a4a4c09e46cce5d1ce7: Status 404 returned error can't find the container with id e698f98cfc8ae5d6f5ea4aa44381ba9e5573fe7604841a4a4c09e46cce5d1ce7 Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.520709 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h5448"] Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.537515 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-79rjr"] Mar 21 04:43:40 crc kubenswrapper[4839]: W0321 04:43:40.566916 4839 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34a240db_9587_446e_af12_a44b87b1a3ac.slice/crio-31e3db5d72b01022f1986c2a2ec56c7bc9d0988d79194c9f2f0ab8aa647233ba WatchSource:0}: Error finding container 31e3db5d72b01022f1986c2a2ec56c7bc9d0988d79194c9f2f0ab8aa647233ba: Status 404 returned error can't find the container with id 31e3db5d72b01022f1986c2a2ec56c7bc9d0988d79194c9f2f0ab8aa647233ba Mar 21 04:43:40 crc kubenswrapper[4839]: W0321 04:43:40.577429 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a5cee9b_67b3_40b1_bc62_e6a3c4c1272d.slice/crio-47ea97309a6522272d1049c002a6ef10748bf6ff9838c09443d1183896d8f227 WatchSource:0}: Error finding container 47ea97309a6522272d1049c002a6ef10748bf6ff9838c09443d1183896d8f227: Status 404 returned error can't find the container with id 47ea97309a6522272d1049c002a6ef10748bf6ff9838c09443d1183896d8f227 Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.687501 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a5f8-account-create-update-2srvb" event={"ID":"f816daf8-a9c7-4e99-a622-2f9bee7d203a","Type":"ContainerStarted","Data":"94ac03e112c80ef0d5418ea52888bc6a7eac3f10db846d5d3635730b5e38afad"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.689652 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"e698f98cfc8ae5d6f5ea4aa44381ba9e5573fe7604841a4a4c09e46cce5d1ce7"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.691585 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-79rjr" event={"ID":"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d","Type":"ContainerStarted","Data":"47ea97309a6522272d1049c002a6ef10748bf6ff9838c09443d1183896d8f227"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.693113 4839 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-db-sync-qgdlf" event={"ID":"bc21c34c-13c1-4733-9013-0cfd304b179c","Type":"ContainerStarted","Data":"5aa954aabfea506f0a5478f8c3a3555f34687b1de83455cee6f271375bfc4c66"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.695447 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3d03-account-create-update-q9rgd" event={"ID":"a7dfdbcf-7830-4f8d-a165-119fe80d999a","Type":"ContainerStarted","Data":"1fd2a9e659b1f417a5acc26d40481b58d37731bc164379bcd010cc11a61ef9ec"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.695479 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3d03-account-create-update-q9rgd" event={"ID":"a7dfdbcf-7830-4f8d-a165-119fe80d999a","Type":"ContainerStarted","Data":"33a682b67cb79b3a9c84f654296a23ef31bd7ed28ffe98f4e4f81581ace6187b"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.700403 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4d6f-account-create-update-st2sv" event={"ID":"8c8ad856-1b19-4b1c-8124-2e316dd567ee","Type":"ContainerStarted","Data":"61e0e39b96984fba4ce019fbf23e5e9abac32712ce1dead1f6d41d879dfd2bde"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.701499 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h5448" event={"ID":"34a240db-9587-446e-af12-a44b87b1a3ac","Type":"ContainerStarted","Data":"31e3db5d72b01022f1986c2a2ec56c7bc9d0988d79194c9f2f0ab8aa647233ba"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.705498 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xc9zf" event={"ID":"b567c69c-d110-4ab2-aaf7-da82f0e72cc3","Type":"ContainerStarted","Data":"1c106f22b4e401c674f904200a929e5e68e3e4f4a62178a136c50ceb882cf719"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.705542 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xc9zf" 
event={"ID":"b567c69c-d110-4ab2-aaf7-da82f0e72cc3","Type":"ContainerStarted","Data":"e3011e29981c589ee376b9ae98f36e4cbafbbd3d1eaa73e7fc08679d2e065608"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.710457 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nv7qf" event={"ID":"ae4cf7f5-74ed-45d7-ace7-24ada744db6c","Type":"ContainerStarted","Data":"534939e65abd8a9805fd9f039f7ba98bd25147b5037e43d0a993f57bc2e141ef"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.715067 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-3d03-account-create-update-q9rgd" podStartSLOduration=7.715049071 podStartE2EDuration="7.715049071s" podCreationTimestamp="2026-03-21 04:43:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:40.710559066 +0000 UTC m=+1225.038345742" watchObservedRunningTime="2026-03-21 04:43:40.715049071 +0000 UTC m=+1225.042835747" Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.716893 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ng2tw" event={"ID":"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1","Type":"ContainerStarted","Data":"79604402661ee3c465cb72ff146dbc568553c3204385175c4f68e9dccfa5a6c6"} Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.745367 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-xc9zf" podStartSLOduration=7.745349099 podStartE2EDuration="7.745349099s" podCreationTimestamp="2026-03-21 04:43:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:43:40.740707329 +0000 UTC m=+1225.068494005" watchObservedRunningTime="2026-03-21 04:43:40.745349099 +0000 UTC m=+1225.073135775" Mar 21 04:43:40 crc kubenswrapper[4839]: I0321 04:43:40.778861 4839 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-ng2tw" podStartSLOduration=2.781630836 podStartE2EDuration="15.778831086s" podCreationTimestamp="2026-03-21 04:43:25 +0000 UTC" firstStartedPulling="2026-03-21 04:43:26.499345993 +0000 UTC m=+1210.827132669" lastFinishedPulling="2026-03-21 04:43:39.496546243 +0000 UTC m=+1223.824332919" observedRunningTime="2026-03-21 04:43:40.768838886 +0000 UTC m=+1225.096625572" watchObservedRunningTime="2026-03-21 04:43:40.778831086 +0000 UTC m=+1225.106617762" Mar 21 04:43:41 crc kubenswrapper[4839]: E0321 04:43:41.334748 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf816daf8_a9c7_4e99_a622_2f9bee7d203a.slice/crio-conmon-1fc9b78e56e247468e98f90edfea187e15daf4ea152975c90e4d68c87986ba79.scope\": RecentStats: unable to find data in memory cache]" Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.741323 4839 generic.go:334] "Generic (PLEG): container finished" podID="f816daf8-a9c7-4e99-a622-2f9bee7d203a" containerID="1fc9b78e56e247468e98f90edfea187e15daf4ea152975c90e4d68c87986ba79" exitCode=0 Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.741893 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a5f8-account-create-update-2srvb" event={"ID":"f816daf8-a9c7-4e99-a622-2f9bee7d203a","Type":"ContainerDied","Data":"1fc9b78e56e247468e98f90edfea187e15daf4ea152975c90e4d68c87986ba79"} Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.747808 4839 generic.go:334] "Generic (PLEG): container finished" podID="34a240db-9587-446e-af12-a44b87b1a3ac" containerID="3b8d2c5a2c9686ff0f867f32b67ae1a47eca812fdcbd3b93adee22c245151532" exitCode=0 Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.747892 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h5448" 
event={"ID":"34a240db-9587-446e-af12-a44b87b1a3ac","Type":"ContainerDied","Data":"3b8d2c5a2c9686ff0f867f32b67ae1a47eca812fdcbd3b93adee22c245151532"} Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.751212 4839 generic.go:334] "Generic (PLEG): container finished" podID="b567c69c-d110-4ab2-aaf7-da82f0e72cc3" containerID="1c106f22b4e401c674f904200a929e5e68e3e4f4a62178a136c50ceb882cf719" exitCode=0 Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.751303 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xc9zf" event={"ID":"b567c69c-d110-4ab2-aaf7-da82f0e72cc3","Type":"ContainerDied","Data":"1c106f22b4e401c674f904200a929e5e68e3e4f4a62178a136c50ceb882cf719"} Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.752946 4839 generic.go:334] "Generic (PLEG): container finished" podID="ae4cf7f5-74ed-45d7-ace7-24ada744db6c" containerID="3a57c10b2c78e441d84cbfab2416b69cf3571cc562b77b3b3134c8875131a599" exitCode=0 Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.753033 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nv7qf" event={"ID":"ae4cf7f5-74ed-45d7-ace7-24ada744db6c","Type":"ContainerDied","Data":"3a57c10b2c78e441d84cbfab2416b69cf3571cc562b77b3b3134c8875131a599"} Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.754279 4839 generic.go:334] "Generic (PLEG): container finished" podID="9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d" containerID="ebddf9a5729dde6feb4416ede20f92a3dd052bc816ed0593e001a7eb65da5807" exitCode=0 Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.754338 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-79rjr" event={"ID":"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d","Type":"ContainerDied","Data":"ebddf9a5729dde6feb4416ede20f92a3dd052bc816ed0593e001a7eb65da5807"} Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.760629 4839 generic.go:334] "Generic (PLEG): container finished" 
podID="a7dfdbcf-7830-4f8d-a165-119fe80d999a" containerID="1fd2a9e659b1f417a5acc26d40481b58d37731bc164379bcd010cc11a61ef9ec" exitCode=0 Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.760870 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3d03-account-create-update-q9rgd" event={"ID":"a7dfdbcf-7830-4f8d-a165-119fe80d999a","Type":"ContainerDied","Data":"1fd2a9e659b1f417a5acc26d40481b58d37731bc164379bcd010cc11a61ef9ec"} Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.764886 4839 generic.go:334] "Generic (PLEG): container finished" podID="8c8ad856-1b19-4b1c-8124-2e316dd567ee" containerID="032f3b05c1ff562800fafe59fd0384b7d678921d7fe7e90157dab690dc2e9894" exitCode=0 Mar 21 04:43:41 crc kubenswrapper[4839]: I0321 04:43:41.764965 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4d6f-account-create-update-st2sv" event={"ID":"8c8ad856-1b19-4b1c-8124-2e316dd567ee","Type":"ContainerDied","Data":"032f3b05c1ff562800fafe59fd0384b7d678921d7fe7e90157dab690dc2e9894"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.293761 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.301337 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.316138 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h5448" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.323370 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.330932 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.343340 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.358448 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7dfdbcf-7830-4f8d-a165-119fe80d999a-operator-scripts\") pod \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.358603 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4t9p\" (UniqueName: \"kubernetes.io/projected/a7dfdbcf-7830-4f8d-a165-119fe80d999a-kube-api-access-b4t9p\") pod \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\" (UID: \"a7dfdbcf-7830-4f8d-a165-119fe80d999a\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.358681 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cgqq\" (UniqueName: \"kubernetes.io/projected/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-kube-api-access-7cgqq\") pod \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\" (UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.358789 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-operator-scripts\") pod \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\" (UID: \"ae4cf7f5-74ed-45d7-ace7-24ada744db6c\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.358955 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.360382 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae4cf7f5-74ed-45d7-ace7-24ada744db6c" (UID: "ae4cf7f5-74ed-45d7-ace7-24ada744db6c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.362074 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7dfdbcf-7830-4f8d-a165-119fe80d999a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7dfdbcf-7830-4f8d-a165-119fe80d999a" (UID: "a7dfdbcf-7830-4f8d-a165-119fe80d999a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.369136 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-kube-api-access-7cgqq" (OuterVolumeSpecName: "kube-api-access-7cgqq") pod "ae4cf7f5-74ed-45d7-ace7-24ada744db6c" (UID: "ae4cf7f5-74ed-45d7-ace7-24ada744db6c"). InnerVolumeSpecName "kube-api-access-7cgqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.384636 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7dfdbcf-7830-4f8d-a165-119fe80d999a-kube-api-access-b4t9p" (OuterVolumeSpecName: "kube-api-access-b4t9p") pod "a7dfdbcf-7830-4f8d-a165-119fe80d999a" (UID: "a7dfdbcf-7830-4f8d-a165-119fe80d999a"). InnerVolumeSpecName "kube-api-access-b4t9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.460781 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c8ad856-1b19-4b1c-8124-2e316dd567ee-operator-scripts\") pod \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461338 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfp6q\" (UniqueName: \"kubernetes.io/projected/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-kube-api-access-sfp6q\") pod \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461383 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c8ad856-1b19-4b1c-8124-2e316dd567ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c8ad856-1b19-4b1c-8124-2e316dd567ee" (UID: "8c8ad856-1b19-4b1c-8124-2e316dd567ee"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461474 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fm77\" (UniqueName: \"kubernetes.io/projected/f816daf8-a9c7-4e99-a622-2f9bee7d203a-kube-api-access-6fm77\") pod \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461507 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtcj5\" (UniqueName: \"kubernetes.io/projected/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-kube-api-access-mtcj5\") pod \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\" (UID: \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461587 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f816daf8-a9c7-4e99-a622-2f9bee7d203a-operator-scripts\") pod \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\" (UID: \"f816daf8-a9c7-4e99-a622-2f9bee7d203a\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461618 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a240db-9587-446e-af12-a44b87b1a3ac-operator-scripts\") pod \"34a240db-9587-446e-af12-a44b87b1a3ac\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461730 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5brcn\" (UniqueName: \"kubernetes.io/projected/34a240db-9587-446e-af12-a44b87b1a3ac-kube-api-access-5brcn\") pod \"34a240db-9587-446e-af12-a44b87b1a3ac\" (UID: \"34a240db-9587-446e-af12-a44b87b1a3ac\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461761 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-operator-scripts\") pod \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\" (UID: \"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461792 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-operator-scripts\") pod \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\" (UID: \"b567c69c-d110-4ab2-aaf7-da82f0e72cc3\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.461922 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gmsm\" (UniqueName: \"kubernetes.io/projected/8c8ad856-1b19-4b1c-8124-2e316dd567ee-kube-api-access-9gmsm\") pod \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\" (UID: \"8c8ad856-1b19-4b1c-8124-2e316dd567ee\") " Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.462742 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f816daf8-a9c7-4e99-a622-2f9bee7d203a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f816daf8-a9c7-4e99-a622-2f9bee7d203a" (UID: "f816daf8-a9c7-4e99-a622-2f9bee7d203a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.462747 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d" (UID: "9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.462754 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b567c69c-d110-4ab2-aaf7-da82f0e72cc3" (UID: "b567c69c-d110-4ab2-aaf7-da82f0e72cc3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.463088 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34a240db-9587-446e-af12-a44b87b1a3ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "34a240db-9587-446e-af12-a44b87b1a3ac" (UID: "34a240db-9587-446e-af12-a44b87b1a3ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.463383 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.463538 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.463681 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4t9p\" (UniqueName: \"kubernetes.io/projected/a7dfdbcf-7830-4f8d-a165-119fe80d999a-kube-api-access-b4t9p\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.463770 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cgqq\" (UniqueName: 
\"kubernetes.io/projected/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-kube-api-access-7cgqq\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.463863 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae4cf7f5-74ed-45d7-ace7-24ada744db6c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.463945 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c8ad856-1b19-4b1c-8124-2e316dd567ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.464054 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7dfdbcf-7830-4f8d-a165-119fe80d999a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.464170 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f816daf8-a9c7-4e99-a622-2f9bee7d203a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.464258 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34a240db-9587-446e-af12-a44b87b1a3ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.466696 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34a240db-9587-446e-af12-a44b87b1a3ac-kube-api-access-5brcn" (OuterVolumeSpecName: "kube-api-access-5brcn") pod "34a240db-9587-446e-af12-a44b87b1a3ac" (UID: "34a240db-9587-446e-af12-a44b87b1a3ac"). InnerVolumeSpecName "kube-api-access-5brcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.476737 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-kube-api-access-sfp6q" (OuterVolumeSpecName: "kube-api-access-sfp6q") pod "9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d" (UID: "9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d"). InnerVolumeSpecName "kube-api-access-sfp6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.479556 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8ad856-1b19-4b1c-8124-2e316dd567ee-kube-api-access-9gmsm" (OuterVolumeSpecName: "kube-api-access-9gmsm") pod "8c8ad856-1b19-4b1c-8124-2e316dd567ee" (UID: "8c8ad856-1b19-4b1c-8124-2e316dd567ee"). InnerVolumeSpecName "kube-api-access-9gmsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.480583 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f816daf8-a9c7-4e99-a622-2f9bee7d203a-kube-api-access-6fm77" (OuterVolumeSpecName: "kube-api-access-6fm77") pod "f816daf8-a9c7-4e99-a622-2f9bee7d203a" (UID: "f816daf8-a9c7-4e99-a622-2f9bee7d203a"). InnerVolumeSpecName "kube-api-access-6fm77". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.498236 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-kube-api-access-mtcj5" (OuterVolumeSpecName: "kube-api-access-mtcj5") pod "b567c69c-d110-4ab2-aaf7-da82f0e72cc3" (UID: "b567c69c-d110-4ab2-aaf7-da82f0e72cc3"). InnerVolumeSpecName "kube-api-access-mtcj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.565771 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5brcn\" (UniqueName: \"kubernetes.io/projected/34a240db-9587-446e-af12-a44b87b1a3ac-kube-api-access-5brcn\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.565826 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gmsm\" (UniqueName: \"kubernetes.io/projected/8c8ad856-1b19-4b1c-8124-2e316dd567ee-kube-api-access-9gmsm\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.565835 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfp6q\" (UniqueName: \"kubernetes.io/projected/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d-kube-api-access-sfp6q\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.565845 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fm77\" (UniqueName: \"kubernetes.io/projected/f816daf8-a9c7-4e99-a622-2f9bee7d203a-kube-api-access-6fm77\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.565854 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtcj5\" (UniqueName: \"kubernetes.io/projected/b567c69c-d110-4ab2-aaf7-da82f0e72cc3-kube-api-access-mtcj5\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.807139 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4d6f-account-create-update-st2sv" event={"ID":"8c8ad856-1b19-4b1c-8124-2e316dd567ee","Type":"ContainerDied","Data":"61e0e39b96984fba4ce019fbf23e5e9abac32712ce1dead1f6d41d879dfd2bde"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.807187 4839 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="61e0e39b96984fba4ce019fbf23e5e9abac32712ce1dead1f6d41d879dfd2bde" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.807156 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4d6f-account-create-update-st2sv" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.808580 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a5f8-account-create-update-2srvb" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.808601 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a5f8-account-create-update-2srvb" event={"ID":"f816daf8-a9c7-4e99-a622-2f9bee7d203a","Type":"ContainerDied","Data":"94ac03e112c80ef0d5418ea52888bc6a7eac3f10db846d5d3635730b5e38afad"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.808651 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94ac03e112c80ef0d5418ea52888bc6a7eac3f10db846d5d3635730b5e38afad" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.811208 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-xc9zf" event={"ID":"b567c69c-d110-4ab2-aaf7-da82f0e72cc3","Type":"ContainerDied","Data":"e3011e29981c589ee376b9ae98f36e4cbafbbd3d1eaa73e7fc08679d2e065608"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.811252 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3011e29981c589ee376b9ae98f36e4cbafbbd3d1eaa73e7fc08679d2e065608" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.811309 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-xc9zf" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.812774 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3d03-account-create-update-q9rgd" event={"ID":"a7dfdbcf-7830-4f8d-a165-119fe80d999a","Type":"ContainerDied","Data":"33a682b67cb79b3a9c84f654296a23ef31bd7ed28ffe98f4e4f81581ace6187b"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.812800 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33a682b67cb79b3a9c84f654296a23ef31bd7ed28ffe98f4e4f81581ace6187b" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.812850 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3d03-account-create-update-q9rgd" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.815480 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-79rjr" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.815495 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-79rjr" event={"ID":"9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d","Type":"ContainerDied","Data":"47ea97309a6522272d1049c002a6ef10748bf6ff9838c09443d1183896d8f227"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.815525 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47ea97309a6522272d1049c002a6ef10748bf6ff9838c09443d1183896d8f227" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.818478 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h5448" event={"ID":"34a240db-9587-446e-af12-a44b87b1a3ac","Type":"ContainerDied","Data":"31e3db5d72b01022f1986c2a2ec56c7bc9d0988d79194c9f2f0ab8aa647233ba"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.818500 4839 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="31e3db5d72b01022f1986c2a2ec56c7bc9d0988d79194c9f2f0ab8aa647233ba" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.818551 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h5448" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.821385 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nv7qf" Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.821365 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nv7qf" event={"ID":"ae4cf7f5-74ed-45d7-ace7-24ada744db6c","Type":"ContainerDied","Data":"534939e65abd8a9805fd9f039f7ba98bd25147b5037e43d0a993f57bc2e141ef"} Mar 21 04:43:46 crc kubenswrapper[4839]: I0321 04:43:46.821682 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="534939e65abd8a9805fd9f039f7ba98bd25147b5037e43d0a993f57bc2e141ef" Mar 21 04:43:51 crc kubenswrapper[4839]: I0321 04:43:51.864863 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qgdlf" event={"ID":"bc21c34c-13c1-4733-9013-0cfd304b179c","Type":"ContainerStarted","Data":"412e0d9615c7dcab7728f617fda54216ecfc01e31d3155750522d0825a7d167a"} Mar 21 04:43:51 crc kubenswrapper[4839]: I0321 04:43:51.900725 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-qgdlf" podStartSLOduration=6.628246323 podStartE2EDuration="17.900699923s" podCreationTimestamp="2026-03-21 04:43:34 +0000 UTC" firstStartedPulling="2026-03-21 04:43:40.381920842 +0000 UTC m=+1224.709707528" lastFinishedPulling="2026-03-21 04:43:51.654374452 +0000 UTC m=+1235.982161128" observedRunningTime="2026-03-21 04:43:51.883924764 +0000 UTC m=+1236.211711450" watchObservedRunningTime="2026-03-21 04:43:51.900699923 +0000 UTC m=+1236.228486599" Mar 21 04:43:52 crc kubenswrapper[4839]: I0321 04:43:52.879737 4839 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"4e5b10c70eee61e441b4838fa7c2c853ba2b08fd634d9e172c49c87b2faeef46"} Mar 21 04:43:52 crc kubenswrapper[4839]: I0321 04:43:52.880066 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"a9b37f3384b4b0eae51f61d02514ddd05f5f508134f5482ab9d86baa4db5f11a"} Mar 21 04:43:52 crc kubenswrapper[4839]: I0321 04:43:52.880078 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"a86c7925f0c57c463d27dda933e80a017e3b6ba16431475b33a0c1032c17688d"} Mar 21 04:43:52 crc kubenswrapper[4839]: I0321 04:43:52.880087 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"299e07809867521cdc6adb7c8e489a9dcd37b132e6dd10141d563e7b532c1da8"} Mar 21 04:43:54 crc kubenswrapper[4839]: I0321 04:43:54.901083 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"4f63cc95cc378ebf27d8d9ee231175db976cde31102666f063f8234adf98aef0"} Mar 21 04:43:55 crc kubenswrapper[4839]: I0321 04:43:55.917962 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"2543c12a287a398f236910833bc11731c9e0ee7068b40132c0ab3acbd115509f"} Mar 21 04:43:55 crc kubenswrapper[4839]: I0321 04:43:55.918344 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"06b61ebed8cd68ee21289b45b2c6605ec597911e951f4357b7aaba905adc5461"} Mar 21 04:43:55 crc kubenswrapper[4839]: I0321 04:43:55.918359 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"4d03449d96fdc2c9659cd49ad827e80ab4c39587576b6f127c94e3590e3edd32"} Mar 21 04:43:56 crc kubenswrapper[4839]: I0321 04:43:56.929552 4839 generic.go:334] "Generic (PLEG): container finished" podID="bc21c34c-13c1-4733-9013-0cfd304b179c" containerID="412e0d9615c7dcab7728f617fda54216ecfc01e31d3155750522d0825a7d167a" exitCode=0 Mar 21 04:43:56 crc kubenswrapper[4839]: I0321 04:43:56.929689 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qgdlf" event={"ID":"bc21c34c-13c1-4733-9013-0cfd304b179c","Type":"ContainerDied","Data":"412e0d9615c7dcab7728f617fda54216ecfc01e31d3155750522d0825a7d167a"} Mar 21 04:43:56 crc kubenswrapper[4839]: I0321 04:43:56.936088 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"18fafb2b7cedaeb91e2ed9cae4eb4702c1bb5391ffb3b7f246e498bc73062cf8"} Mar 21 04:43:56 crc kubenswrapper[4839]: I0321 04:43:56.936133 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"92047ef2b6fad140e351aba2e4e1717c008862f404cab80c606272eecccccf5e"} Mar 21 04:43:57 crc kubenswrapper[4839]: I0321 04:43:57.961715 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"9dc2a480548652ec0f569c2ff8001cc12924ece46aa719e945a332d0283f8c51"} Mar 21 04:43:57 crc kubenswrapper[4839]: I0321 04:43:57.962063 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"5ad2a70ac0163e6e775fd4e3f93931ef5d151c0c8130145323cb39ebf1423ddd"} Mar 21 04:43:57 crc kubenswrapper[4839]: I0321 04:43:57.962073 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"28bcd0b01ffcf842c57eeb0a6600606662b8b4c1d2547451d70ae901579d7e66"} Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.391497 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.501264 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcwg2\" (UniqueName: \"kubernetes.io/projected/bc21c34c-13c1-4733-9013-0cfd304b179c-kube-api-access-vcwg2\") pod \"bc21c34c-13c1-4733-9013-0cfd304b179c\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.502007 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-combined-ca-bundle\") pod \"bc21c34c-13c1-4733-9013-0cfd304b179c\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.502095 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-config-data\") pod \"bc21c34c-13c1-4733-9013-0cfd304b179c\" (UID: \"bc21c34c-13c1-4733-9013-0cfd304b179c\") " Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.506221 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bc21c34c-13c1-4733-9013-0cfd304b179c-kube-api-access-vcwg2" (OuterVolumeSpecName: "kube-api-access-vcwg2") pod "bc21c34c-13c1-4733-9013-0cfd304b179c" (UID: "bc21c34c-13c1-4733-9013-0cfd304b179c"). InnerVolumeSpecName "kube-api-access-vcwg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.526550 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc21c34c-13c1-4733-9013-0cfd304b179c" (UID: "bc21c34c-13c1-4733-9013-0cfd304b179c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.550443 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-config-data" (OuterVolumeSpecName: "config-data") pod "bc21c34c-13c1-4733-9013-0cfd304b179c" (UID: "bc21c34c-13c1-4733-9013-0cfd304b179c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.604553 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcwg2\" (UniqueName: \"kubernetes.io/projected/bc21c34c-13c1-4733-9013-0cfd304b179c-kube-api-access-vcwg2\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.604611 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.604623 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc21c34c-13c1-4733-9013-0cfd304b179c-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.970556 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qgdlf" event={"ID":"bc21c34c-13c1-4733-9013-0cfd304b179c","Type":"ContainerDied","Data":"5aa954aabfea506f0a5478f8c3a3555f34687b1de83455cee6f271375bfc4c66"} Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.970620 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aa954aabfea506f0a5478f8c3a3555f34687b1de83455cee6f271375bfc4c66" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.970690 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qgdlf" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.985786 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"583bd871e4062ae14fed460f55bea0ef1edad5015242520ba12620d807ed1490"} Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:58.985824 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9848d2f0-c562-4b2a-bd1c-cd91c6754079","Type":"ContainerStarted","Data":"16691c3afc23c06b01006fa62e6582e6657a5a82dd9dce41a2cfa310cd369135"} Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.033067 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=45.396027161 podStartE2EDuration="1m1.033048622s" podCreationTimestamp="2026-03-21 04:42:58 +0000 UTC" firstStartedPulling="2026-03-21 04:43:40.539434128 +0000 UTC m=+1224.867220804" lastFinishedPulling="2026-03-21 04:43:56.176455589 +0000 UTC m=+1240.504242265" observedRunningTime="2026-03-21 04:43:59.021028956 +0000 UTC m=+1243.348815632" watchObservedRunningTime="2026-03-21 04:43:59.033048622 +0000 UTC m=+1243.360835298" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.320605 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-ch994"] Mar 21 04:43:59 crc kubenswrapper[4839]: E0321 04:43:59.321001 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321023 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: E0321 04:43:59.321036 4839 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b567c69c-d110-4ab2-aaf7-da82f0e72cc3" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321045 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b567c69c-d110-4ab2-aaf7-da82f0e72cc3" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: E0321 04:43:59.321056 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8ad856-1b19-4b1c-8124-2e316dd567ee" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321064 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8ad856-1b19-4b1c-8124-2e316dd567ee" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: E0321 04:43:59.321085 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae4cf7f5-74ed-45d7-ace7-24ada744db6c" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321107 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae4cf7f5-74ed-45d7-ace7-24ada744db6c" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: E0321 04:43:59.321121 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f816daf8-a9c7-4e99-a622-2f9bee7d203a" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321127 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f816daf8-a9c7-4e99-a622-2f9bee7d203a" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: E0321 04:43:59.321137 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc21c34c-13c1-4733-9013-0cfd304b179c" containerName="keystone-db-sync" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321143 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc21c34c-13c1-4733-9013-0cfd304b179c" containerName="keystone-db-sync" Mar 21 04:43:59 crc 
kubenswrapper[4839]: E0321 04:43:59.321149 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7dfdbcf-7830-4f8d-a165-119fe80d999a" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321155 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7dfdbcf-7830-4f8d-a165-119fe80d999a" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: E0321 04:43:59.321164 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34a240db-9587-446e-af12-a44b87b1a3ac" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321170 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a240db-9587-446e-af12-a44b87b1a3ac" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321353 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7dfdbcf-7830-4f8d-a165-119fe80d999a" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321374 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc21c34c-13c1-4733-9013-0cfd304b179c" containerName="keystone-db-sync" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321389 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b567c69c-d110-4ab2-aaf7-da82f0e72cc3" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321399 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="34a240db-9587-446e-af12-a44b87b1a3ac" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321409 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae4cf7f5-74ed-45d7-ace7-24ada744db6c" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321420 4839 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d" containerName="mariadb-database-create" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321433 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f816daf8-a9c7-4e99-a622-2f9bee7d203a" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.321455 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8ad856-1b19-4b1c-8124-2e316dd567ee" containerName="mariadb-account-create-update" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.322238 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.342359 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-ch994"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.368986 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5rr4j"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.370243 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.398820 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.399340 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pzsvm" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.399481 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.399655 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.399866 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.413661 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5rr4j"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.415545 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j28tz\" (UniqueName: \"kubernetes.io/projected/5e86a461-8b9c-4850-b084-5a242058db02-kube-api-access-j28tz\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.415657 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-config\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.415698 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.415766 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.415892 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517218 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-config\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517524 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517585 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517609 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-scripts\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517643 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-config-data\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517679 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-combined-ca-bundle\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517726 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-fernet-keys\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517749 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517768 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-credential-keys\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517786 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqg7c\" (UniqueName: \"kubernetes.io/projected/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-kube-api-access-rqg7c\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.517807 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j28tz\" (UniqueName: \"kubernetes.io/projected/5e86a461-8b9c-4850-b084-5a242058db02-kube-api-access-j28tz\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.518114 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-config\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.518621 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.518731 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.519194 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.571173 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j28tz\" (UniqueName: \"kubernetes.io/projected/5e86a461-8b9c-4850-b084-5a242058db02-kube-api-access-j28tz\") pod \"dnsmasq-dns-5c9d85d47c-ch994\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") " pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.581183 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-qfjms"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.582197 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.593117 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4bb6p" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.593304 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.593405 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.620450 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-fernet-keys\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.620507 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-credential-keys\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.620525 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqg7c\" (UniqueName: \"kubernetes.io/projected/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-kube-api-access-rqg7c\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.620631 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-scripts\") pod \"keystone-bootstrap-5rr4j\" (UID: 
\"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.620664 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-config-data\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.620694 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-combined-ca-bundle\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.636239 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-fernet-keys\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.638000 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-combined-ca-bundle\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.642495 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.642919 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-config-data\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.643946 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-scripts\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.646185 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-ch994"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.661526 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-credential-keys\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.706917 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qfjms"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.727358 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqg7c\" (UniqueName: \"kubernetes.io/projected/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-kube-api-access-rqg7c\") pod \"keystone-bootstrap-5rr4j\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.735420 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-scripts\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.735466 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-combined-ca-bundle\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.735533 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6mpt\" (UniqueName: \"kubernetes.io/projected/6000d2d4-e84a-443f-9094-ab999541331d-kube-api-access-g6mpt\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.735586 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6000d2d4-e84a-443f-9094-ab999541331d-etc-machine-id\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.735607 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-db-sync-config-data\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.735635 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-config-data\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.740646 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-554fbfcbdf-wqcc5"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.742348 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.754527 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-pn6kj" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.754695 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.754824 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.754992 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.791374 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-nm9t5"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.792620 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.811762 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.813425 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.818232 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-5mrkq" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.838620 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-nm9t5"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.841993 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6mpt\" (UniqueName: \"kubernetes.io/projected/6000d2d4-e84a-443f-9094-ab999541331d-kube-api-access-g6mpt\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842033 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-scripts\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842084 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6000d2d4-e84a-443f-9094-ab999541331d-etc-machine-id\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842106 4839 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-db-sync-config-data\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842134 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02db0b32-3683-4d02-b645-3cea2cd59b7d-horizon-secret-key\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842163 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-config-data\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842196 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-config-data\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842241 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-scripts\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842263 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-combined-ca-bundle\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842299 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02db0b32-3683-4d02-b645-3cea2cd59b7d-logs\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842327 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ccwx\" (UniqueName: \"kubernetes.io/projected/02db0b32-3683-4d02-b645-3cea2cd59b7d-kube-api-access-5ccwx\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.842714 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6000d2d4-e84a-443f-9094-ab999541331d-etc-machine-id\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.869734 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-combined-ca-bundle\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.869818 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-db-sync-config-data\") pod 
\"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.888433 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6mpt\" (UniqueName: \"kubernetes.io/projected/6000d2d4-e84a-443f-9094-ab999541331d-kube-api-access-g6mpt\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.916599 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-scripts\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.921698 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b868669f-g4f92"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.928970 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.936912 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.940312 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-wdddk"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.942439 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wdddk" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.944010 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02db0b32-3683-4d02-b645-3cea2cd59b7d-logs\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.944088 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ccwx\" (UniqueName: \"kubernetes.io/projected/02db0b32-3683-4d02-b645-3cea2cd59b7d-kube-api-access-5ccwx\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.944131 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-scripts\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.944182 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02db0b32-3683-4d02-b645-3cea2cd59b7d-horizon-secret-key\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.944261 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-config-data\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 
04:43:59.944307 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-combined-ca-bundle\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.944331 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhrt9\" (UniqueName: \"kubernetes.io/projected/625a99bd-bc01-400e-8e9c-1f5eff390466-kube-api-access-zhrt9\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.944358 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-config\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.945320 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02db0b32-3683-4d02-b645-3cea2cd59b7d-logs\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.946296 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-config-data\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.946384 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-config-data\") pod \"cinder-db-sync-qfjms\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " pod="openstack/cinder-db-sync-qfjms" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.946932 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-scripts\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.957503 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.957985 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.958273 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qhlcg" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.958514 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-554fbfcbdf-wqcc5"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.959025 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02db0b32-3683-4d02-b645-3cea2cd59b7d-horizon-secret-key\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.980075 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wdddk"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.986841 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-g4f92"] Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 
04:43:59.993434 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ccwx\" (UniqueName: \"kubernetes.io/projected/02db0b32-3683-4d02-b645-3cea2cd59b7d-kube-api-access-5ccwx\") pod \"horizon-554fbfcbdf-wqcc5\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:43:59 crc kubenswrapper[4839]: I0321 04:43:59.993510 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-t8kxj"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.002466 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.025041 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-g4f92"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.025166 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.031025 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qnmpn" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.031264 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.034099 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-t8kxj"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050530 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-scripts\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050589 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-config\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050611 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-combined-ca-bundle\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050631 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r6lw\" (UniqueName: \"kubernetes.io/projected/9fec7a31-49df-4e3c-9266-8c21d7622445-kube-api-access-5r6lw\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050653 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhrt9\" (UniqueName: \"kubernetes.io/projected/625a99bd-bc01-400e-8e9c-1f5eff390466-kube-api-access-zhrt9\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050678 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-config\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050700 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050717 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-config-data\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050738 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-combined-ca-bundle\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050761 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050779 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050794 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdb5v\" (UniqueName: \"kubernetes.io/projected/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-kube-api-access-tdb5v\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050815 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-svc\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.050834 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-logs\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.064637 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-config\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.065366 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-combined-ca-bundle\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.079189 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhrt9\" (UniqueName: 
\"kubernetes.io/projected/625a99bd-bc01-400e-8e9c-1f5eff390466-kube-api-access-zhrt9\") pod \"neutron-db-sync-nm9t5\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.095649 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.100464 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mr2ng"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.102773 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.112795 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:00 crc kubenswrapper[4839]: E0321 04:44:00.143437 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-5r6lw ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5b868669f-g4f92" podUID="9fec7a31-49df-4e3c-9266-8c21d7622445" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152380 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkzvv\" (UniqueName: \"kubernetes.io/projected/6d0e1745-6e0b-475c-a1de-d049018abea6-kube-api-access-hkzvv\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152435 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-scripts\") pod \"placement-db-sync-wdddk\" (UID: 
\"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152486 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-db-sync-config-data\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152518 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-config\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152591 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r6lw\" (UniqueName: \"kubernetes.io/projected/9fec7a31-49df-4e3c-9266-8c21d7622445-kube-api-access-5r6lw\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152649 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-config-data\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152671 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " 
pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152709 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-combined-ca-bundle\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152750 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152775 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152799 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdb5v\" (UniqueName: \"kubernetes.io/projected/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-kube-api-access-tdb5v\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152829 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-svc\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc 
kubenswrapper[4839]: I0321 04:44:00.152861 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-logs\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.152914 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-combined-ca-bundle\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.154041 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.154131 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-config\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.154771 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.155372 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-logs\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.156078 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-svc\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.157003 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.158256 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-scripts\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.159207 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-combined-ca-bundle\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.159265 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mr2ng"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.160334 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-config-data\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.170959 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.175054 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.177207 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.181595 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdb5v\" (UniqueName: \"kubernetes.io/projected/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-kube-api-access-tdb5v\") pod \"placement-db-sync-wdddk\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.181741 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.188111 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r6lw\" (UniqueName: \"kubernetes.io/projected/9fec7a31-49df-4e3c-9266-8c21d7622445-kube-api-access-5r6lw\") pod \"dnsmasq-dns-5b868669f-g4f92\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.197129 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.215537 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c9d7f5-72d27"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 
04:44:00.217331 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.225602 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qfjms" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.239526 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c9d7f5-72d27"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256328 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-config\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256404 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-config-data\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256441 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-combined-ca-bundle\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256526 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-svc\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 
21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256578 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-log-httpd\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256610 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256662 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7fdg\" (UniqueName: \"kubernetes.io/projected/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-kube-api-access-n7fdg\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256685 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-scripts\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256725 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-run-httpd\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256750 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256823 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkzvv\" (UniqueName: \"kubernetes.io/projected/6d0e1745-6e0b-475c-a1de-d049018abea6-kube-api-access-hkzvv\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256886 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-db-sync-config-data\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.256915 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.257070 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.257103 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.257155 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zddh\" (UniqueName: \"kubernetes.io/projected/6c266726-5bfd-4519-bdd5-9db7f6a77df4-kube-api-access-5zddh\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.275024 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-combined-ca-bundle\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.279225 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkzvv\" (UniqueName: \"kubernetes.io/projected/6d0e1745-6e0b-475c-a1de-d049018abea6-kube-api-access-hkzvv\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.283186 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-db-sync-config-data\") pod \"barbican-db-sync-t8kxj\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.307930 4839 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29567804-jn7hw"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.309693 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567804-jn7hw" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.312362 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.312650 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.313285 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.356579 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567804-jn7hw"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.370787 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7fdg\" (UniqueName: \"kubernetes.io/projected/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-kube-api-access-n7fdg\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.370838 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-scripts\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.370888 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-run-httpd\") pod \"ceilometer-0\" (UID: 
\"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.370904 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.370968 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t7wq\" (UniqueName: \"kubernetes.io/projected/3193915f-60d3-4c8e-aa15-858213ce011c-kube-api-access-6t7wq\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371011 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371082 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371110 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 
04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371159 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zddh\" (UniqueName: \"kubernetes.io/projected/6c266726-5bfd-4519-bdd5-9db7f6a77df4-kube-api-access-5zddh\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371239 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-config-data\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371561 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-config\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371619 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-config-data\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371637 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-scripts\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371669 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3193915f-60d3-4c8e-aa15-858213ce011c-logs\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371689 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3193915f-60d3-4c8e-aa15-858213ce011c-horizon-secret-key\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371717 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-svc\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371748 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-log-httpd\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.371772 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.372441 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-run-httpd\") pod \"ceilometer-0\" 
(UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.373346 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.373546 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-config\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.374752 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.376024 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-svc\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.376759 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: 
I0321 04:44:00.377471 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-log-httpd\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.381097 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.391351 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-scripts\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.394241 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.398538 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-config-data\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.406205 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7fdg\" (UniqueName: \"kubernetes.io/projected/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-kube-api-access-n7fdg\") pod \"dnsmasq-dns-cf78879c9-mr2ng\" (UID: 
\"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.410413 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zddh\" (UniqueName: \"kubernetes.io/projected/6c266726-5bfd-4519-bdd5-9db7f6a77df4-kube-api-access-5zddh\") pod \"ceilometer-0\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.459169 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.476243 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t7wq\" (UniqueName: \"kubernetes.io/projected/3193915f-60d3-4c8e-aa15-858213ce011c-kube-api-access-6t7wq\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.476372 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-config-data\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.476425 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrqtc\" (UniqueName: \"kubernetes.io/projected/117f0438-5ab3-4616-b574-c5bbc43e8ac9-kube-api-access-rrqtc\") pod \"auto-csr-approver-29567804-jn7hw\" (UID: \"117f0438-5ab3-4616-b574-c5bbc43e8ac9\") " pod="openshift-infra/auto-csr-approver-29567804-jn7hw" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.476477 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-scripts\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.476507 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3193915f-60d3-4c8e-aa15-858213ce011c-logs\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.476532 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3193915f-60d3-4c8e-aa15-858213ce011c-horizon-secret-key\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.478306 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-scripts\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.478857 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3193915f-60d3-4c8e-aa15-858213ce011c-logs\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.478721 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-config-data\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 
04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.478968 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.484532 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3193915f-60d3-4c8e-aa15-858213ce011c-horizon-secret-key\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.489915 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.504067 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t7wq\" (UniqueName: \"kubernetes.io/projected/3193915f-60d3-4c8e-aa15-858213ce011c-kube-api-access-6t7wq\") pod \"horizon-7c9d7f5-72d27\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.516835 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.553686 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.571387 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-ch994"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.578608 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrqtc\" (UniqueName: \"kubernetes.io/projected/117f0438-5ab3-4616-b574-c5bbc43e8ac9-kube-api-access-rrqtc\") pod \"auto-csr-approver-29567804-jn7hw\" (UID: \"117f0438-5ab3-4616-b574-c5bbc43e8ac9\") " pod="openshift-infra/auto-csr-approver-29567804-jn7hw" Mar 21 04:44:00 crc kubenswrapper[4839]: W0321 04:44:00.611280 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e86a461_8b9c_4850_b084_5a242058db02.slice/crio-3695c4efcb7ed7c99b03416f2c3b5313f86380f8412fb12414d41e72c87f8386 WatchSource:0}: Error finding container 3695c4efcb7ed7c99b03416f2c3b5313f86380f8412fb12414d41e72c87f8386: Status 404 returned error can't find the container with id 3695c4efcb7ed7c99b03416f2c3b5313f86380f8412fb12414d41e72c87f8386 Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.617415 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrqtc\" (UniqueName: \"kubernetes.io/projected/117f0438-5ab3-4616-b574-c5bbc43e8ac9-kube-api-access-rrqtc\") pod \"auto-csr-approver-29567804-jn7hw\" (UID: \"117f0438-5ab3-4616-b574-c5bbc43e8ac9\") " pod="openshift-infra/auto-csr-approver-29567804-jn7hw" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.631287 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567804-jn7hw" Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.702358 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5rr4j"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.808387 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-nm9t5"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.941648 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qfjms"] Mar 21 04:44:00 crc kubenswrapper[4839]: I0321 04:44:00.965155 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-554fbfcbdf-wqcc5"] Mar 21 04:44:00 crc kubenswrapper[4839]: W0321 04:44:00.996901 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod625a99bd_bc01_400e_8e9c_1f5eff390466.slice/crio-829be773cbac605730f273e58ffebe5c5615f40baf855f3c2212fe6c649c7cf3 WatchSource:0}: Error finding container 829be773cbac605730f273e58ffebe5c5615f40baf855f3c2212fe6c649c7cf3: Status 404 returned error can't find the container with id 829be773cbac605730f273e58ffebe5c5615f40baf855f3c2212fe6c649c7cf3 Mar 21 04:44:01 crc kubenswrapper[4839]: W0321 04:44:01.006143 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02db0b32_3683_4d02_b645_3cea2cd59b7d.slice/crio-8e3dd636e6fac659887299acf1ac0e45d1e3d4824f9b0e8c44ea6e8f2b5429e5 WatchSource:0}: Error finding container 8e3dd636e6fac659887299acf1ac0e45d1e3d4824f9b0e8c44ea6e8f2b5429e5: Status 404 returned error can't find the container with id 8e3dd636e6fac659887299acf1ac0e45d1e3d4824f9b0e8c44ea6e8f2b5429e5 Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.101020 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" 
event={"ID":"5e86a461-8b9c-4850-b084-5a242058db02","Type":"ContainerStarted","Data":"3695c4efcb7ed7c99b03416f2c3b5313f86380f8412fb12414d41e72c87f8386"} Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.112023 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5rr4j" event={"ID":"848aa53a-bd67-4733-aad7-6ac0f6fc0a15","Type":"ContainerStarted","Data":"70f7b322b7c3ad74c3e8a9620d17f9758e57775147f74650f1a626aa0f7a8463"} Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.118817 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nm9t5" event={"ID":"625a99bd-bc01-400e-8e9c-1f5eff390466","Type":"ContainerStarted","Data":"829be773cbac605730f273e58ffebe5c5615f40baf855f3c2212fe6c649c7cf3"} Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.166621 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-554fbfcbdf-wqcc5" event={"ID":"02db0b32-3683-4d02-b645-3cea2cd59b7d","Type":"ContainerStarted","Data":"8e3dd636e6fac659887299acf1ac0e45d1e3d4824f9b0e8c44ea6e8f2b5429e5"} Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.171677 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qfjms" event={"ID":"6000d2d4-e84a-443f-9094-ab999541331d","Type":"ContainerStarted","Data":"4d5cb1d53067040b399cf367f961d70a4e98d3e793e42e6da997085ddb0d9688"} Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.171749 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.229670 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.304839 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-svc\") pod \"9fec7a31-49df-4e3c-9266-8c21d7622445\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.304886 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-swift-storage-0\") pod \"9fec7a31-49df-4e3c-9266-8c21d7622445\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.304984 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r6lw\" (UniqueName: \"kubernetes.io/projected/9fec7a31-49df-4e3c-9266-8c21d7622445-kube-api-access-5r6lw\") pod \"9fec7a31-49df-4e3c-9266-8c21d7622445\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.305072 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-config\") pod \"9fec7a31-49df-4e3c-9266-8c21d7622445\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.305114 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-nb\") pod \"9fec7a31-49df-4e3c-9266-8c21d7622445\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.305176 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-sb\") pod \"9fec7a31-49df-4e3c-9266-8c21d7622445\" (UID: \"9fec7a31-49df-4e3c-9266-8c21d7622445\") " Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.305526 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9fec7a31-49df-4e3c-9266-8c21d7622445" (UID: "9fec7a31-49df-4e3c-9266-8c21d7622445"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.306215 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9fec7a31-49df-4e3c-9266-8c21d7622445" (UID: "9fec7a31-49df-4e3c-9266-8c21d7622445"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.306702 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-config" (OuterVolumeSpecName: "config") pod "9fec7a31-49df-4e3c-9266-8c21d7622445" (UID: "9fec7a31-49df-4e3c-9266-8c21d7622445"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.307223 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9fec7a31-49df-4e3c-9266-8c21d7622445" (UID: "9fec7a31-49df-4e3c-9266-8c21d7622445"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.307476 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9fec7a31-49df-4e3c-9266-8c21d7622445" (UID: "9fec7a31-49df-4e3c-9266-8c21d7622445"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.307916 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.307941 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.307953 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.307962 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.307970 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fec7a31-49df-4e3c-9266-8c21d7622445-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.332930 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wdddk"] Mar 21 04:44:01 crc 
kubenswrapper[4839]: I0321 04:44:01.335793 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fec7a31-49df-4e3c-9266-8c21d7622445-kube-api-access-5r6lw" (OuterVolumeSpecName: "kube-api-access-5r6lw") pod "9fec7a31-49df-4e3c-9266-8c21d7622445" (UID: "9fec7a31-49df-4e3c-9266-8c21d7622445"). InnerVolumeSpecName "kube-api-access-5r6lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.409401 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r6lw\" (UniqueName: \"kubernetes.io/projected/9fec7a31-49df-4e3c-9266-8c21d7622445-kube-api-access-5r6lw\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.635769 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mr2ng"] Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.666943 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.679190 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-t8kxj"] Mar 21 04:44:01 crc kubenswrapper[4839]: W0321 04:44:01.687288 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d0e1745_6e0b_475c_a1de_d049018abea6.slice/crio-00637144ea664e135a3a03c08667ad9ad9e5c84e3814ae65ec02c62e19d9549d WatchSource:0}: Error finding container 00637144ea664e135a3a03c08667ad9ad9e5c84e3814ae65ec02c62e19d9549d: Status 404 returned error can't find the container with id 00637144ea664e135a3a03c08667ad9ad9e5c84e3814ae65ec02c62e19d9549d Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.847831 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567804-jn7hw"] Mar 21 04:44:01 crc kubenswrapper[4839]: I0321 04:44:01.889509 4839 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c9d7f5-72d27"] Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.120595 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.163916 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-554fbfcbdf-wqcc5"] Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.163974 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-75c94899fc-bkxlk"] Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.190065 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.210238 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75c94899fc-bkxlk"] Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.229936 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c9d7f5-72d27" event={"ID":"3193915f-60d3-4c8e-aa15-858213ce011c","Type":"ContainerStarted","Data":"6e4a51f272197d48dcbc81a0aecd9739f163dd5e41504342f75386e4fcf464f5"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.235008 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567804-jn7hw" event={"ID":"117f0438-5ab3-4616-b574-c5bbc43e8ac9","Type":"ContainerStarted","Data":"6abcdf109ee136fd47cf7c735b5b48052a42904b27a3f0d33fd3e2a18c075320"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.236886 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-logs\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.236956 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-config-data\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.237079 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-scripts\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.237115 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9ntv\" (UniqueName: \"kubernetes.io/projected/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-kube-api-access-s9ntv\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.237147 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-horizon-secret-key\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.253528 4839 generic.go:334] "Generic (PLEG): container finished" podID="5e86a461-8b9c-4850-b084-5a242058db02" containerID="85588b4178127cda15e4b9075c4a1854a46c41932df5300f78b062a7e2468918" exitCode=0 Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.253620 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" 
event={"ID":"5e86a461-8b9c-4850-b084-5a242058db02","Type":"ContainerDied","Data":"85588b4178127cda15e4b9075c4a1854a46c41932df5300f78b062a7e2468918"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.257295 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c266726-5bfd-4519-bdd5-9db7f6a77df4","Type":"ContainerStarted","Data":"41ed81fbf037f8ebe50fd1cd4bb84f9e7c73f61ee6cb668dca265d806ca14d96"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.263237 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" event={"ID":"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb","Type":"ContainerStarted","Data":"52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.263288 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" event={"ID":"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb","Type":"ContainerStarted","Data":"2dbe5499f5ce6b46711307e191213d3557376716206b4f9aec95cbff6dcd4f72"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.293610 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5rr4j" event={"ID":"848aa53a-bd67-4733-aad7-6ac0f6fc0a15","Type":"ContainerStarted","Data":"2233fa3f3ad560ada373befd98764d7c67680bcb094c6c63415e8ef4dc05b7f7"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.306419 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nm9t5" event={"ID":"625a99bd-bc01-400e-8e9c-1f5eff390466","Type":"ContainerStarted","Data":"dfcec3a2306ecb1c0b0e9a1bd05577683dcbd7efc3319d4ee942c6e22862d913"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.308868 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t8kxj" 
event={"ID":"6d0e1745-6e0b-475c-a1de-d049018abea6","Type":"ContainerStarted","Data":"00637144ea664e135a3a03c08667ad9ad9e5c84e3814ae65ec02c62e19d9549d"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.335335 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-g4f92" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.335682 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wdddk" event={"ID":"e6e87cbd-1f46-4fa0-9529-8250f9fee21c","Type":"ContainerStarted","Data":"e10dee2b21cfdb75da16c639d865bd8e8d3823159b603d6fda5a875f34a0fb47"} Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.348004 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-scripts\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.348141 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9ntv\" (UniqueName: \"kubernetes.io/projected/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-kube-api-access-s9ntv\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.348228 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-horizon-secret-key\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.348482 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-logs\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.348683 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-config-data\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.354543 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-scripts\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.355297 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-config-data\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.357817 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-logs\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.372331 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-horizon-secret-key\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " 
pod="openstack/horizon-75c94899fc-bkxlk"
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.383614 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9ntv\" (UniqueName: \"kubernetes.io/projected/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-kube-api-access-s9ntv\") pod \"horizon-75c94899fc-bkxlk\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " pod="openstack/horizon-75c94899fc-bkxlk"
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.389726 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5rr4j" podStartSLOduration=3.389699575 podStartE2EDuration="3.389699575s" podCreationTimestamp="2026-03-21 04:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:02.37233443 +0000 UTC m=+1246.700121126" watchObservedRunningTime="2026-03-21 04:44:02.389699575 +0000 UTC m=+1246.717486261"
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.461796 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-nm9t5" podStartSLOduration=3.4617797120000002 podStartE2EDuration="3.461779712s" podCreationTimestamp="2026-03-21 04:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:02.412130793 +0000 UTC m=+1246.739917469" watchObservedRunningTime="2026-03-21 04:44:02.461779712 +0000 UTC m=+1246.789566388"
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.504217 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-g4f92"]
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.516300 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-g4f92"]
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.533374 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75c94899fc-bkxlk"
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.665685 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-ch994"
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.766846 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j28tz\" (UniqueName: \"kubernetes.io/projected/5e86a461-8b9c-4850-b084-5a242058db02-kube-api-access-j28tz\") pod \"5e86a461-8b9c-4850-b084-5a242058db02\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") "
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.766905 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-dns-svc\") pod \"5e86a461-8b9c-4850-b084-5a242058db02\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") "
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.766930 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-config\") pod \"5e86a461-8b9c-4850-b084-5a242058db02\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") "
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.767000 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-nb\") pod \"5e86a461-8b9c-4850-b084-5a242058db02\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") "
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.767042 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-sb\") pod \"5e86a461-8b9c-4850-b084-5a242058db02\" (UID: \"5e86a461-8b9c-4850-b084-5a242058db02\") "
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.773962 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e86a461-8b9c-4850-b084-5a242058db02-kube-api-access-j28tz" (OuterVolumeSpecName: "kube-api-access-j28tz") pod "5e86a461-8b9c-4850-b084-5a242058db02" (UID: "5e86a461-8b9c-4850-b084-5a242058db02"). InnerVolumeSpecName "kube-api-access-j28tz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.824211 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e86a461-8b9c-4850-b084-5a242058db02" (UID: "5e86a461-8b9c-4850-b084-5a242058db02"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.824259 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e86a461-8b9c-4850-b084-5a242058db02" (UID: "5e86a461-8b9c-4850-b084-5a242058db02"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.824794 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-config" (OuterVolumeSpecName: "config") pod "5e86a461-8b9c-4850-b084-5a242058db02" (UID: "5e86a461-8b9c-4850-b084-5a242058db02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.828514 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e86a461-8b9c-4850-b084-5a242058db02" (UID: "5e86a461-8b9c-4850-b084-5a242058db02"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.871653 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.872003 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.872017 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j28tz\" (UniqueName: \"kubernetes.io/projected/5e86a461-8b9c-4850-b084-5a242058db02-kube-api-access-j28tz\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.872063 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:02 crc kubenswrapper[4839]: I0321 04:44:02.872079 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e86a461-8b9c-4850-b084-5a242058db02-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.190913 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75c94899fc-bkxlk"]
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.363123 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567804-jn7hw" event={"ID":"117f0438-5ab3-4616-b574-c5bbc43e8ac9","Type":"ContainerStarted","Data":"91c89b78e4a205a25af8a93dc758c0974e237fac7942a3cc2a1f6b03e61923e1"}
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.366277 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75c94899fc-bkxlk" event={"ID":"f9b42e2e-3015-4ae1-a3a9-3eb96949b021","Type":"ContainerStarted","Data":"04fe2b7baf42bfa7035c15041a5de93662e87c4359a787bd7c9b47e57eb2a7fa"}
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.376996 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-ch994" event={"ID":"5e86a461-8b9c-4850-b084-5a242058db02","Type":"ContainerDied","Data":"3695c4efcb7ed7c99b03416f2c3b5313f86380f8412fb12414d41e72c87f8386"}
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.377042 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-ch994"
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.377076 4839 scope.go:117] "RemoveContainer" containerID="85588b4178127cda15e4b9075c4a1854a46c41932df5300f78b062a7e2468918"
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.389868 4839 generic.go:334] "Generic (PLEG): container finished" podID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerID="52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349" exitCode=0
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.389950 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" event={"ID":"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb","Type":"ContainerDied","Data":"52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349"}
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.389988 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" event={"ID":"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb","Type":"ContainerStarted","Data":"14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598"}
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.450309 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" podStartSLOduration=4.450291966 podStartE2EDuration="4.450291966s" podCreationTimestamp="2026-03-21 04:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:03.449838733 +0000 UTC m=+1247.777625429" watchObservedRunningTime="2026-03-21 04:44:03.450291966 +0000 UTC m=+1247.778078642"
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.499814 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-ch994"]
Mar 21 04:44:03 crc kubenswrapper[4839]: I0321 04:44:03.523219 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-ch994"]
Mar 21 04:44:04 crc kubenswrapper[4839]: I0321 04:44:04.419041 4839 generic.go:334] "Generic (PLEG): container finished" podID="2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" containerID="79604402661ee3c465cb72ff146dbc568553c3204385175c4f68e9dccfa5a6c6" exitCode=0
Mar 21 04:44:04 crc kubenswrapper[4839]: I0321 04:44:04.419145 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ng2tw" event={"ID":"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1","Type":"ContainerDied","Data":"79604402661ee3c465cb72ff146dbc568553c3204385175c4f68e9dccfa5a6c6"}
Mar 21 04:44:04 crc kubenswrapper[4839]: I0321 04:44:04.422854 4839 generic.go:334] "Generic (PLEG): container finished" podID="117f0438-5ab3-4616-b574-c5bbc43e8ac9" containerID="91c89b78e4a205a25af8a93dc758c0974e237fac7942a3cc2a1f6b03e61923e1" exitCode=0
Mar 21 04:44:04 crc kubenswrapper[4839]: I0321 04:44:04.422945 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567804-jn7hw" event={"ID":"117f0438-5ab3-4616-b574-c5bbc43e8ac9","Type":"ContainerDied","Data":"91c89b78e4a205a25af8a93dc758c0974e237fac7942a3cc2a1f6b03e61923e1"}
Mar 21 04:44:04 crc kubenswrapper[4839]: I0321 04:44:04.423245 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng"
Mar 21 04:44:04 crc kubenswrapper[4839]: I0321 04:44:04.484888 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e86a461-8b9c-4850-b084-5a242058db02" path="/var/lib/kubelet/pods/5e86a461-8b9c-4850-b084-5a242058db02/volumes"
Mar 21 04:44:04 crc kubenswrapper[4839]: I0321 04:44:04.485494 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fec7a31-49df-4e3c-9266-8c21d7622445" path="/var/lib/kubelet/pods/9fec7a31-49df-4e3c-9266-8c21d7622445/volumes"
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.891651 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c9d7f5-72d27"]
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.923674 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-84c6c985f8-v5cmh"]
Mar 21 04:44:08 crc kubenswrapper[4839]: E0321 04:44:08.924056 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e86a461-8b9c-4850-b084-5a242058db02" containerName="init"
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.924071 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e86a461-8b9c-4850-b084-5a242058db02" containerName="init"
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.924264 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e86a461-8b9c-4850-b084-5a242058db02" containerName="init"
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.925136 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.928294 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.942271 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84c6c985f8-v5cmh"]
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.970604 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75c94899fc-bkxlk"]
Mar 21 04:44:08 crc kubenswrapper[4839]: I0321 04:44:08.999309 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9c97f4dbd-k2scs"]
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.002519 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-tls-certs\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.002580 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-scripts\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.002628 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-config-data\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.002672 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-secret-key\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.002730 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-combined-ca-bundle\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.002753 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmrcp\" (UniqueName: \"kubernetes.io/projected/b3b26c3a-55d5-442a-9c31-187b0aa60f90-kube-api-access-vmrcp\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.002769 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b26c3a-55d5-442a-9c31-187b0aa60f90-logs\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.004120 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.018949 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9c97f4dbd-k2scs"]
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104428 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-config-data\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104488 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/579308eb-854d-4160-ad35-8677f2d0e634-config-data\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104520 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-horizon-secret-key\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104545 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-secret-key\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104596 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-horizon-tls-certs\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104648 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-combined-ca-bundle\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104667 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/579308eb-854d-4160-ad35-8677f2d0e634-logs\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104686 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmrcp\" (UniqueName: \"kubernetes.io/projected/b3b26c3a-55d5-442a-9c31-187b0aa60f90-kube-api-access-vmrcp\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104699 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/579308eb-854d-4160-ad35-8677f2d0e634-scripts\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104718 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b26c3a-55d5-442a-9c31-187b0aa60f90-logs\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104750 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-tls-certs\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104766 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-combined-ca-bundle\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104791 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-scripts\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.104814 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r22bn\" (UniqueName: \"kubernetes.io/projected/579308eb-854d-4160-ad35-8677f2d0e634-kube-api-access-r22bn\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.105891 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-config-data\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.108084 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b26c3a-55d5-442a-9c31-187b0aa60f90-logs\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.108198 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-scripts\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.112703 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-secret-key\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.112776 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-combined-ca-bundle\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.113017 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-tls-certs\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.124855 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmrcp\" (UniqueName: \"kubernetes.io/projected/b3b26c3a-55d5-442a-9c31-187b0aa60f90-kube-api-access-vmrcp\") pod \"horizon-84c6c985f8-v5cmh\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.205930 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/579308eb-854d-4160-ad35-8677f2d0e634-logs\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.205986 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/579308eb-854d-4160-ad35-8677f2d0e634-scripts\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.206039 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-combined-ca-bundle\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.206085 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r22bn\" (UniqueName: \"kubernetes.io/projected/579308eb-854d-4160-ad35-8677f2d0e634-kube-api-access-r22bn\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.206135 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/579308eb-854d-4160-ad35-8677f2d0e634-config-data\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.206171 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-horizon-secret-key\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.206204 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-horizon-tls-certs\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.207184 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/579308eb-854d-4160-ad35-8677f2d0e634-scripts\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.207519 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/579308eb-854d-4160-ad35-8677f2d0e634-config-data\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.208774 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/579308eb-854d-4160-ad35-8677f2d0e634-logs\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.210679 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-horizon-tls-certs\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.211121 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-combined-ca-bundle\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.212790 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/579308eb-854d-4160-ad35-8677f2d0e634-horizon-secret-key\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.235584 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r22bn\" (UniqueName: \"kubernetes.io/projected/579308eb-854d-4160-ad35-8677f2d0e634-kube-api-access-r22bn\") pod \"horizon-9c97f4dbd-k2scs\" (UID: \"579308eb-854d-4160-ad35-8677f2d0e634\") " pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.261134 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84c6c985f8-v5cmh"
Mar 21 04:44:09 crc kubenswrapper[4839]: I0321 04:44:09.327950 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9c97f4dbd-k2scs"
Mar 21 04:44:10 crc kubenswrapper[4839]: I0321 04:44:10.491763 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng"
Mar 21 04:44:10 crc kubenswrapper[4839]: I0321 04:44:10.546597 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zkqb7"]
Mar 21 04:44:10 crc kubenswrapper[4839]: I0321 04:44:10.546874 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns" containerID="cri-o://66a460b182805c08827a7b4f6980d98fea84c8290c7b4fe1cb071b3630a6c029" gracePeriod=10
Mar 21 04:44:11 crc kubenswrapper[4839]: I0321 04:44:11.487516 4839 generic.go:334] "Generic (PLEG): container finished" podID="67dd1633-1450-4153-b0af-b6887f61944c" containerID="66a460b182805c08827a7b4f6980d98fea84c8290c7b4fe1cb071b3630a6c029" exitCode=0
Mar 21 04:44:11 crc kubenswrapper[4839]: I0321 04:44:11.487561 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" event={"ID":"67dd1633-1450-4153-b0af-b6887f61944c","Type":"ContainerDied","Data":"66a460b182805c08827a7b4f6980d98fea84c8290c7b4fe1cb071b3630a6c029"}
Mar 21 04:44:13 crc kubenswrapper[4839]: I0321 04:44:13.639785 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused"
Mar 21 04:44:14 crc kubenswrapper[4839]: I0321 04:44:14.581875 4839 generic.go:334] "Generic (PLEG): container finished" podID="848aa53a-bd67-4733-aad7-6ac0f6fc0a15" containerID="2233fa3f3ad560ada373befd98764d7c67680bcb094c6c63415e8ef4dc05b7f7" exitCode=0
Mar 21 04:44:14 crc kubenswrapper[4839]: I0321 04:44:14.581936 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5rr4j" event={"ID":"848aa53a-bd67-4733-aad7-6ac0f6fc0a15","Type":"ContainerDied","Data":"2233fa3f3ad560ada373befd98764d7c67680bcb094c6c63415e8ef4dc05b7f7"}
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.259594 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567804-jn7hw"
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.266641 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ng2tw"
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.412964 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8phmg\" (UniqueName: \"kubernetes.io/projected/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-kube-api-access-8phmg\") pod \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") "
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.413063 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrqtc\" (UniqueName: \"kubernetes.io/projected/117f0438-5ab3-4616-b574-c5bbc43e8ac9-kube-api-access-rrqtc\") pod \"117f0438-5ab3-4616-b574-c5bbc43e8ac9\" (UID: \"117f0438-5ab3-4616-b574-c5bbc43e8ac9\") "
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.413134 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-config-data\") pod \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") "
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.413211 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-db-sync-config-data\") pod \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") "
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.413272 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-combined-ca-bundle\") pod \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\" (UID: \"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1\") "
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.420517 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-kube-api-access-8phmg" (OuterVolumeSpecName: "kube-api-access-8phmg") pod "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" (UID: "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1"). InnerVolumeSpecName "kube-api-access-8phmg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.420966 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" (UID: "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.421209 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/117f0438-5ab3-4616-b574-c5bbc43e8ac9-kube-api-access-rrqtc" (OuterVolumeSpecName: "kube-api-access-rrqtc") pod "117f0438-5ab3-4616-b574-c5bbc43e8ac9" (UID: "117f0438-5ab3-4616-b574-c5bbc43e8ac9"). InnerVolumeSpecName "kube-api-access-rrqtc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.444013 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" (UID: "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.465682 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-config-data" (OuterVolumeSpecName: "config-data") pod "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" (UID: "2cc1dfb9-8108-46e5-8dc5-5b555590ecc1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.515956 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.515989 4839 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.515999 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.516010 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8phmg\" (UniqueName: \"kubernetes.io/projected/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1-kube-api-access-8phmg\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.516019 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrqtc\" (UniqueName: \"kubernetes.io/projected/117f0438-5ab3-4616-b574-c5bbc43e8ac9-kube-api-access-rrqtc\") on node \"crc\" DevicePath \"\""
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.597757 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567804-jn7hw" event={"ID":"117f0438-5ab3-4616-b574-c5bbc43e8ac9","Type":"ContainerDied","Data":"6abcdf109ee136fd47cf7c735b5b48052a42904b27a3f0d33fd3e2a18c075320"}
Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.597796 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6abcdf109ee136fd47cf7c735b5b48052a42904b27a3f0d33fd3e2a18c075320"
Mar 21 04:44:15 crc
kubenswrapper[4839]: I0321 04:44:15.597844 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567804-jn7hw" Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.605063 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ng2tw" Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.605126 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ng2tw" event={"ID":"2cc1dfb9-8108-46e5-8dc5-5b555590ecc1","Type":"ContainerDied","Data":"65ed5290939376cc07f297d06efe7a7f9acbf33da55d639c2cc318b6e8be4b9e"} Mar 21 04:44:15 crc kubenswrapper[4839]: I0321 04:44:15.605297 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65ed5290939376cc07f297d06efe7a7f9acbf33da55d639c2cc318b6e8be4b9e" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.338300 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567798-k5zv2"] Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.345694 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567798-k5zv2"] Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.464972 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad32cfd7-7b60-4c76-8df2-eb2e65b102c3" path="/var/lib/kubelet/pods/ad32cfd7-7b60-4c76-8df2-eb2e65b102c3/volumes" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.674100 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lzkn7"] Mar 21 04:44:16 crc kubenswrapper[4839]: E0321 04:44:16.674674 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117f0438-5ab3-4616-b574-c5bbc43e8ac9" containerName="oc" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.674700 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="117f0438-5ab3-4616-b574-c5bbc43e8ac9" 
containerName="oc" Mar 21 04:44:16 crc kubenswrapper[4839]: E0321 04:44:16.674740 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" containerName="glance-db-sync" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.674750 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" containerName="glance-db-sync" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.674971 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" containerName="glance-db-sync" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.675007 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="117f0438-5ab3-4616-b574-c5bbc43e8ac9" containerName="oc" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.676131 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.687767 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lzkn7"] Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.840542 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gbq4\" (UniqueName: \"kubernetes.io/projected/93294d9d-21ef-43b5-bac5-35d24543d02a-kube-api-access-6gbq4\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.840848 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc 
kubenswrapper[4839]: I0321 04:44:16.840971 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.841080 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-config\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.841299 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.841368 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.943243 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 
21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.943324 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.943448 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gbq4\" (UniqueName: \"kubernetes.io/projected/93294d9d-21ef-43b5-bac5-35d24543d02a-kube-api-access-6gbq4\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.943508 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.943542 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.943591 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-config\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.944178 
4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.944609 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.944877 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.945007 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.945845 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-config\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:16 crc kubenswrapper[4839]: I0321 04:44:16.963690 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gbq4\" (UniqueName: 
\"kubernetes.io/projected/93294d9d-21ef-43b5-bac5-35d24543d02a-kube-api-access-6gbq4\") pod \"dnsmasq-dns-56df8fb6b7-lzkn7\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.003351 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:17 crc kubenswrapper[4839]: E0321 04:44:17.062231 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 21 04:44:17 crc kubenswrapper[4839]: E0321 04:44:17.062689 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5d5h58ch665h5fch55h67bh669h589h654h84hc9h86h5f5h5b4h678h586h548h96hf5h55dh59h594hc9h5c6h85h94h587h58ch57fh8fh67dh58q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPa
th:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5zddh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(6c266726-5bfd-4519-bdd5-9db7f6a77df4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.713637 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.715302 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.717309 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.717973 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v5bc4" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.718008 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.741001 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.859525 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-logs\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.859618 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.859663 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wc6r\" (UniqueName: \"kubernetes.io/projected/6008d784-c50b-4079-a7b4-c160b8202956-kube-api-access-7wc6r\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc 
kubenswrapper[4839]: I0321 04:44:17.859681 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.859697 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-scripts\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.859737 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-config-data\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.859920 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.873783 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.875611 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.882370 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.882629 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.962109 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wc6r\" (UniqueName: \"kubernetes.io/projected/6008d784-c50b-4079-a7b4-c160b8202956-kube-api-access-7wc6r\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.962173 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.962201 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-scripts\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.962263 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-config-data\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 
04:44:17.962334 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.962433 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-logs\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.962502 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.963096 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.963420 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-logs\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.963724 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.968375 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-scripts\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.969720 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.982004 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-config-data\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:17 crc kubenswrapper[4839]: I0321 04:44:17.996484 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wc6r\" (UniqueName: \"kubernetes.io/projected/6008d784-c50b-4079-a7b4-c160b8202956-kube-api-access-7wc6r\") pod \"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.015107 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"glance-default-external-api-0\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.038718 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.064751 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.064831 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdqdc\" (UniqueName: \"kubernetes.io/projected/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-kube-api-access-zdqdc\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.064855 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.064966 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.065009 
4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.065039 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.065065 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.166877 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.167345 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.167443 4839 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.167635 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.167518 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.170145 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.178785 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.179243 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdqdc\" (UniqueName: 
\"kubernetes.io/projected/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-kube-api-access-zdqdc\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.179705 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.172295 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.170833 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-logs\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.183552 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.191905 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-scripts\") pod \"glance-default-internal-api-0\" 
(UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.196744 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.214086 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdqdc\" (UniqueName: \"kubernetes.io/projected/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-kube-api-access-zdqdc\") pod \"glance-default-internal-api-0\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.494058 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:18 crc kubenswrapper[4839]: I0321 04:44:18.639785 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Mar 21 04:44:19 crc kubenswrapper[4839]: I0321 04:44:19.891820 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:44:19 crc kubenswrapper[4839]: I0321 04:44:19.969243 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 04:44:23.631806 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 21 04:44:23 crc kubenswrapper[4839]: 
E0321 04:44:23.632560 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n657h99h5ffh5f6hc8hdchfdh66fh555h95h5dfh64ch574h555h65dh54dhbfh4h688h75h54fhc6h5f5h7bh4h5b8h679hc6h5d7h687h544h5b5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s9ntv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
horizon-75c94899fc-bkxlk_openstack(f9b42e2e-3015-4ae1-a3a9-3eb96949b021): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 04:44:23.634799 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-75c94899fc-bkxlk" podUID="f9b42e2e-3015-4ae1-a3a9-3eb96949b021" Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 04:44:23.636592 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 04:44:23.636704 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n65fh654hb7h645hd4h555h58ch548h665hdfh9fh5f9h5dfh5b4h5d4h546h596h57ch685h79h96h5h5b5hffhcdh5b7h547h549h99h5fchdhd5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5ccwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-554fbfcbdf-wqcc5_openstack(02db0b32-3683-4d02-b645-3cea2cd59b7d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 
04:44:23.638672 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-554fbfcbdf-wqcc5" podUID="02db0b32-3683-4d02-b645-3cea2cd59b7d" Mar 21 04:44:23 crc kubenswrapper[4839]: I0321 04:44:23.641474 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Mar 21 04:44:23 crc kubenswrapper[4839]: I0321 04:44:23.641656 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 04:44:23.651954 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 04:44:23.652158 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nd6h56bh595h678h5cfh645hfdhc7h8bh78h68ch6ch86h556hb4h56fh576h5ch5dh54ch67h658hdh579h65ch598hd4h8fh666h66dh586h544q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6t7wq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7c9d7f5-72d27_openstack(3193915f-60d3-4c8e-aa15-858213ce011c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:44:23 crc kubenswrapper[4839]: E0321 04:44:23.656820 
4839 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7c9d7f5-72d27" podUID="3193915f-60d3-4c8e-aa15-858213ce011c" Mar 21 04:44:29 crc kubenswrapper[4839]: I0321 04:44:29.737871 4839 generic.go:334] "Generic (PLEG): container finished" podID="625a99bd-bc01-400e-8e9c-1f5eff390466" containerID="dfcec3a2306ecb1c0b0e9a1bd05577683dcbd7efc3319d4ee942c6e22862d913" exitCode=0 Mar 21 04:44:29 crc kubenswrapper[4839]: I0321 04:44:29.737921 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nm9t5" event={"ID":"625a99bd-bc01-400e-8e9c-1f5eff390466","Type":"ContainerDied","Data":"dfcec3a2306ecb1c0b0e9a1bd05577683dcbd7efc3319d4ee942c6e22862d913"} Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.350291 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.359787 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.375777 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.379466 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.386375 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443688 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksgp4\" (UniqueName: \"kubernetes.io/projected/67dd1633-1450-4153-b0af-b6887f61944c-kube-api-access-ksgp4\") pod \"67dd1633-1450-4153-b0af-b6887f61944c\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443751 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-scripts\") pod \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443799 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-scripts\") pod \"3193915f-60d3-4c8e-aa15-858213ce011c\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443825 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t7wq\" (UniqueName: \"kubernetes.io/projected/3193915f-60d3-4c8e-aa15-858213ce011c-kube-api-access-6t7wq\") pod \"3193915f-60d3-4c8e-aa15-858213ce011c\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443843 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3193915f-60d3-4c8e-aa15-858213ce011c-horizon-secret-key\") pod \"3193915f-60d3-4c8e-aa15-858213ce011c\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443880 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-scripts\") pod \"02db0b32-3683-4d02-b645-3cea2cd59b7d\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443900 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-scripts\") pod \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443925 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-logs\") pod \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443954 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-credential-keys\") pod \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.443971 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3193915f-60d3-4c8e-aa15-858213ce011c-logs\") pod \"3193915f-60d3-4c8e-aa15-858213ce011c\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444008 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqg7c\" (UniqueName: \"kubernetes.io/projected/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-kube-api-access-rqg7c\") pod \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444026 4839 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02db0b32-3683-4d02-b645-3cea2cd59b7d-horizon-secret-key\") pod \"02db0b32-3683-4d02-b645-3cea2cd59b7d\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444055 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-config\") pod \"67dd1633-1450-4153-b0af-b6887f61944c\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444080 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9ntv\" (UniqueName: \"kubernetes.io/projected/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-kube-api-access-s9ntv\") pod \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444115 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-config-data\") pod \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444144 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-config-data\") pod \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444162 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-fernet-keys\") pod \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\" (UID: 
\"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444194 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-nb\") pod \"67dd1633-1450-4153-b0af-b6887f61944c\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444226 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ccwx\" (UniqueName: \"kubernetes.io/projected/02db0b32-3683-4d02-b645-3cea2cd59b7d-kube-api-access-5ccwx\") pod \"02db0b32-3683-4d02-b645-3cea2cd59b7d\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444250 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-config-data\") pod \"3193915f-60d3-4c8e-aa15-858213ce011c\" (UID: \"3193915f-60d3-4c8e-aa15-858213ce011c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444265 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-sb\") pod \"67dd1633-1450-4153-b0af-b6887f61944c\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444284 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-combined-ca-bundle\") pod \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\" (UID: \"848aa53a-bd67-4733-aad7-6ac0f6fc0a15\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444303 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-config-data\") pod \"02db0b32-3683-4d02-b645-3cea2cd59b7d\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444321 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02db0b32-3683-4d02-b645-3cea2cd59b7d-logs\") pod \"02db0b32-3683-4d02-b645-3cea2cd59b7d\" (UID: \"02db0b32-3683-4d02-b645-3cea2cd59b7d\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444343 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-dns-svc\") pod \"67dd1633-1450-4153-b0af-b6887f61944c\" (UID: \"67dd1633-1450-4153-b0af-b6887f61944c\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.444370 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-horizon-secret-key\") pod \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\" (UID: \"f9b42e2e-3015-4ae1-a3a9-3eb96949b021\") " Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.445855 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-config-data" (OuterVolumeSpecName: "config-data") pod "02db0b32-3683-4d02-b645-3cea2cd59b7d" (UID: "02db0b32-3683-4d02-b645-3cea2cd59b7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.446346 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02db0b32-3683-4d02-b645-3cea2cd59b7d-logs" (OuterVolumeSpecName: "logs") pod "02db0b32-3683-4d02-b645-3cea2cd59b7d" (UID: "02db0b32-3683-4d02-b645-3cea2cd59b7d"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.447285 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-scripts" (OuterVolumeSpecName: "scripts") pod "3193915f-60d3-4c8e-aa15-858213ce011c" (UID: "3193915f-60d3-4c8e-aa15-858213ce011c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.447327 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-logs" (OuterVolumeSpecName: "logs") pod "f9b42e2e-3015-4ae1-a3a9-3eb96949b021" (UID: "f9b42e2e-3015-4ae1-a3a9-3eb96949b021"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.448836 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-scripts" (OuterVolumeSpecName: "scripts") pod "02db0b32-3683-4d02-b645-3cea2cd59b7d" (UID: "02db0b32-3683-4d02-b645-3cea2cd59b7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.449059 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-config-data" (OuterVolumeSpecName: "config-data") pod "f9b42e2e-3015-4ae1-a3a9-3eb96949b021" (UID: "f9b42e2e-3015-4ae1-a3a9-3eb96949b021"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.449282 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-config-data" (OuterVolumeSpecName: "config-data") pod "3193915f-60d3-4c8e-aa15-858213ce011c" (UID: "3193915f-60d3-4c8e-aa15-858213ce011c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.449389 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "848aa53a-bd67-4733-aad7-6ac0f6fc0a15" (UID: "848aa53a-bd67-4733-aad7-6ac0f6fc0a15"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.449480 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67dd1633-1450-4153-b0af-b6887f61944c-kube-api-access-ksgp4" (OuterVolumeSpecName: "kube-api-access-ksgp4") pod "67dd1633-1450-4153-b0af-b6887f61944c" (UID: "67dd1633-1450-4153-b0af-b6887f61944c"). InnerVolumeSpecName "kube-api-access-ksgp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.450778 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3193915f-60d3-4c8e-aa15-858213ce011c-logs" (OuterVolumeSpecName: "logs") pod "3193915f-60d3-4c8e-aa15-858213ce011c" (UID: "3193915f-60d3-4c8e-aa15-858213ce011c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.453828 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-scripts" (OuterVolumeSpecName: "scripts") pod "f9b42e2e-3015-4ae1-a3a9-3eb96949b021" (UID: "f9b42e2e-3015-4ae1-a3a9-3eb96949b021"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.453865 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02db0b32-3683-4d02-b645-3cea2cd59b7d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "02db0b32-3683-4d02-b645-3cea2cd59b7d" (UID: "02db0b32-3683-4d02-b645-3cea2cd59b7d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.454487 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-kube-api-access-rqg7c" (OuterVolumeSpecName: "kube-api-access-rqg7c") pod "848aa53a-bd67-4733-aad7-6ac0f6fc0a15" (UID: "848aa53a-bd67-4733-aad7-6ac0f6fc0a15"). InnerVolumeSpecName "kube-api-access-rqg7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.458505 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "848aa53a-bd67-4733-aad7-6ac0f6fc0a15" (UID: "848aa53a-bd67-4733-aad7-6ac0f6fc0a15"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.459697 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-scripts" (OuterVolumeSpecName: "scripts") pod "848aa53a-bd67-4733-aad7-6ac0f6fc0a15" (UID: "848aa53a-bd67-4733-aad7-6ac0f6fc0a15"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.464712 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3193915f-60d3-4c8e-aa15-858213ce011c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3193915f-60d3-4c8e-aa15-858213ce011c" (UID: "3193915f-60d3-4c8e-aa15-858213ce011c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.464796 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f9b42e2e-3015-4ae1-a3a9-3eb96949b021" (UID: "f9b42e2e-3015-4ae1-a3a9-3eb96949b021"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.464836 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-kube-api-access-s9ntv" (OuterVolumeSpecName: "kube-api-access-s9ntv") pod "f9b42e2e-3015-4ae1-a3a9-3eb96949b021" (UID: "f9b42e2e-3015-4ae1-a3a9-3eb96949b021"). InnerVolumeSpecName "kube-api-access-s9ntv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.464877 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02db0b32-3683-4d02-b645-3cea2cd59b7d-kube-api-access-5ccwx" (OuterVolumeSpecName: "kube-api-access-5ccwx") pod "02db0b32-3683-4d02-b645-3cea2cd59b7d" (UID: "02db0b32-3683-4d02-b645-3cea2cd59b7d"). InnerVolumeSpecName "kube-api-access-5ccwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.464930 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3193915f-60d3-4c8e-aa15-858213ce011c-kube-api-access-6t7wq" (OuterVolumeSpecName: "kube-api-access-6t7wq") pod "3193915f-60d3-4c8e-aa15-858213ce011c" (UID: "3193915f-60d3-4c8e-aa15-858213ce011c"). InnerVolumeSpecName "kube-api-access-6t7wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.489656 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-config-data" (OuterVolumeSpecName: "config-data") pod "848aa53a-bd67-4733-aad7-6ac0f6fc0a15" (UID: "848aa53a-bd67-4733-aad7-6ac0f6fc0a15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.493025 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "848aa53a-bd67-4733-aad7-6ac0f6fc0a15" (UID: "848aa53a-bd67-4733-aad7-6ac0f6fc0a15"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.501394 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "67dd1633-1450-4153-b0af-b6887f61944c" (UID: "67dd1633-1450-4153-b0af-b6887f61944c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.504043 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-config" (OuterVolumeSpecName: "config") pod "67dd1633-1450-4153-b0af-b6887f61944c" (UID: "67dd1633-1450-4153-b0af-b6887f61944c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.509177 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "67dd1633-1450-4153-b0af-b6887f61944c" (UID: "67dd1633-1450-4153-b0af-b6887f61944c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.510083 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67dd1633-1450-4153-b0af-b6887f61944c" (UID: "67dd1633-1450-4153-b0af-b6887f61944c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546031 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksgp4\" (UniqueName: \"kubernetes.io/projected/67dd1633-1450-4153-b0af-b6887f61944c-kube-api-access-ksgp4\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546064 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546073 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546081 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t7wq\" (UniqueName: \"kubernetes.io/projected/3193915f-60d3-4c8e-aa15-858213ce011c-kube-api-access-6t7wq\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546089 4839 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3193915f-60d3-4c8e-aa15-858213ce011c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546098 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546105 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546113 4839 reconciler_common.go:293] 
"Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546120 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3193915f-60d3-4c8e-aa15-858213ce011c-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546128 4839 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546136 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqg7c\" (UniqueName: \"kubernetes.io/projected/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-kube-api-access-rqg7c\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546144 4839 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/02db0b32-3683-4d02-b645-3cea2cd59b7d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546151 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546160 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9ntv\" (UniqueName: \"kubernetes.io/projected/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-kube-api-access-s9ntv\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546167 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-config-data\") on node 
\"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546176 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546183 4839 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546191 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546199 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ccwx\" (UniqueName: \"kubernetes.io/projected/02db0b32-3683-4d02-b645-3cea2cd59b7d-kube-api-access-5ccwx\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546207 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3193915f-60d3-4c8e-aa15-858213ce011c-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546214 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546222 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848aa53a-bd67-4733-aad7-6ac0f6fc0a15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546229 4839 reconciler_common.go:293] "Volume 
detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02db0b32-3683-4d02-b645-3cea2cd59b7d-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546237 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02db0b32-3683-4d02-b645-3cea2cd59b7d-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546244 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67dd1633-1450-4153-b0af-b6887f61944c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.546252 4839 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f9b42e2e-3015-4ae1-a3a9-3eb96949b021-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.756215 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75c94899fc-bkxlk" event={"ID":"f9b42e2e-3015-4ae1-a3a9-3eb96949b021","Type":"ContainerDied","Data":"04fe2b7baf42bfa7035c15041a5de93662e87c4359a787bd7c9b47e57eb2a7fa"} Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.756300 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75c94899fc-bkxlk" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.766453 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5rr4j" event={"ID":"848aa53a-bd67-4733-aad7-6ac0f6fc0a15","Type":"ContainerDied","Data":"70f7b322b7c3ad74c3e8a9620d17f9758e57775147f74650f1a626aa0f7a8463"} Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.766494 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70f7b322b7c3ad74c3e8a9620d17f9758e57775147f74650f1a626aa0f7a8463" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.766559 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5rr4j" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.775468 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" event={"ID":"67dd1633-1450-4153-b0af-b6887f61944c","Type":"ContainerDied","Data":"57de16c4224a656e8f3fcae76650a94702fb081fd5f9e8c3856fcde976889201"} Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.775519 4839 scope.go:117] "RemoveContainer" containerID="66a460b182805c08827a7b4f6980d98fea84c8290c7b4fe1cb071b3630a6c029" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.775591 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.778437 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c9d7f5-72d27" event={"ID":"3193915f-60d3-4c8e-aa15-858213ce011c","Type":"ContainerDied","Data":"6e4a51f272197d48dcbc81a0aecd9739f163dd5e41504342f75386e4fcf464f5"} Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.778498 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c9d7f5-72d27" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.780111 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-554fbfcbdf-wqcc5" event={"ID":"02db0b32-3683-4d02-b645-3cea2cd59b7d","Type":"ContainerDied","Data":"8e3dd636e6fac659887299acf1ac0e45d1e3d4824f9b0e8c44ea6e8f2b5429e5"} Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.780186 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-554fbfcbdf-wqcc5" Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.851320 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75c94899fc-bkxlk"] Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.860378 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-75c94899fc-bkxlk"] Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.883889 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c9d7f5-72d27"] Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.924676 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c9d7f5-72d27"] Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.939819 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-554fbfcbdf-wqcc5"] Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.947744 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-554fbfcbdf-wqcc5"] Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.957265 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zkqb7"] Mar 21 04:44:31 crc kubenswrapper[4839]: I0321 04:44:31.964936 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-zkqb7"] Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.435063 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5rr4j"] 
Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.444061 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5rr4j"] Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.464377 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02db0b32-3683-4d02-b645-3cea2cd59b7d" path="/var/lib/kubelet/pods/02db0b32-3683-4d02-b645-3cea2cd59b7d/volumes" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.465475 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3193915f-60d3-4c8e-aa15-858213ce011c" path="/var/lib/kubelet/pods/3193915f-60d3-4c8e-aa15-858213ce011c/volumes" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.466385 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67dd1633-1450-4153-b0af-b6887f61944c" path="/var/lib/kubelet/pods/67dd1633-1450-4153-b0af-b6887f61944c/volumes" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.467754 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="848aa53a-bd67-4733-aad7-6ac0f6fc0a15" path="/var/lib/kubelet/pods/848aa53a-bd67-4733-aad7-6ac0f6fc0a15/volumes" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.468715 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b42e2e-3015-4ae1-a3a9-3eb96949b021" path="/var/lib/kubelet/pods/f9b42e2e-3015-4ae1-a3a9-3eb96949b021/volumes" Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.476781 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.476984 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g6mpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-qfjms_openstack(6000d2d4-e84a-443f-9094-ab999541331d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.480760 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-qfjms" podUID="6000d2d4-e84a-443f-9094-ab999541331d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.534544 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.538357 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ts52d"] Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.538748 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="init" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.538768 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="init" Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.538778 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848aa53a-bd67-4733-aad7-6ac0f6fc0a15" containerName="keystone-bootstrap" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.538787 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="848aa53a-bd67-4733-aad7-6ac0f6fc0a15" containerName="keystone-bootstrap" Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.538820 4839 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="625a99bd-bc01-400e-8e9c-1f5eff390466" containerName="neutron-db-sync" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.538826 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="625a99bd-bc01-400e-8e9c-1f5eff390466" containerName="neutron-db-sync" Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.538837 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.538842 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.539009 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.539060 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="625a99bd-bc01-400e-8e9c-1f5eff390466" containerName="neutron-db-sync" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.539072 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="848aa53a-bd67-4733-aad7-6ac0f6fc0a15" containerName="keystone-bootstrap" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.539649 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.542285 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.542533 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.542845 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pzsvm" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.543090 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.543173 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.544862 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ts52d"] Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.564168 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-config\") pod \"625a99bd-bc01-400e-8e9c-1f5eff390466\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.564366 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhrt9\" (UniqueName: \"kubernetes.io/projected/625a99bd-bc01-400e-8e9c-1f5eff390466-kube-api-access-zhrt9\") pod \"625a99bd-bc01-400e-8e9c-1f5eff390466\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.564444 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-combined-ca-bundle\") pod \"625a99bd-bc01-400e-8e9c-1f5eff390466\" (UID: \"625a99bd-bc01-400e-8e9c-1f5eff390466\") " Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.564790 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-credential-keys\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.564905 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-scripts\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.564958 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-fernet-keys\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.565033 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-combined-ca-bundle\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.565202 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h955\" (UniqueName: 
\"kubernetes.io/projected/7cada35b-7e7f-4d22-895f-588b90e48c70-kube-api-access-5h955\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.565235 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-config-data\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.567050 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/625a99bd-bc01-400e-8e9c-1f5eff390466-kube-api-access-zhrt9" (OuterVolumeSpecName: "kube-api-access-zhrt9") pod "625a99bd-bc01-400e-8e9c-1f5eff390466" (UID: "625a99bd-bc01-400e-8e9c-1f5eff390466"). InnerVolumeSpecName "kube-api-access-zhrt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.588484 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "625a99bd-bc01-400e-8e9c-1f5eff390466" (UID: "625a99bd-bc01-400e-8e9c-1f5eff390466"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.593199 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-config" (OuterVolumeSpecName: "config") pod "625a99bd-bc01-400e-8e9c-1f5eff390466" (UID: "625a99bd-bc01-400e-8e9c-1f5eff390466"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666599 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h955\" (UniqueName: \"kubernetes.io/projected/7cada35b-7e7f-4d22-895f-588b90e48c70-kube-api-access-5h955\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666649 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-config-data\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666682 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-credential-keys\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666747 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-scripts\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666786 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-fernet-keys\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666834 
4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-combined-ca-bundle\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666881 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhrt9\" (UniqueName: \"kubernetes.io/projected/625a99bd-bc01-400e-8e9c-1f5eff390466-kube-api-access-zhrt9\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666893 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.666902 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/625a99bd-bc01-400e-8e9c-1f5eff390466-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.670727 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-scripts\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.670832 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-combined-ca-bundle\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.671783 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-credential-keys\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.679622 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-fernet-keys\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.680779 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-config-data\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.681887 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h955\" (UniqueName: \"kubernetes.io/projected/7cada35b-7e7f-4d22-895f-588b90e48c70-kube-api-access-5h955\") pod \"keystone-bootstrap-ts52d\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.780644 4839 scope.go:117] "RemoveContainer" containerID="285d767665dbf1b22bee7f8005f18b61072968dd727d608ba30f4f564d8882bb" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.795920 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-nm9t5" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.795966 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nm9t5" event={"ID":"625a99bd-bc01-400e-8e9c-1f5eff390466","Type":"ContainerDied","Data":"829be773cbac605730f273e58ffebe5c5615f40baf855f3c2212fe6c649c7cf3"} Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.796001 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="829be773cbac605730f273e58ffebe5c5615f40baf855f3c2212fe6c649c7cf3" Mar 21 04:44:32 crc kubenswrapper[4839]: E0321 04:44:32.800827 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-qfjms" podUID="6000d2d4-e84a-443f-9094-ab999541331d" Mar 21 04:44:32 crc kubenswrapper[4839]: I0321 04:44:32.920807 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.309627 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84c6c985f8-v5cmh"] Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.322044 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.328561 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9c97f4dbd-k2scs"] Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.550548 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ts52d"] Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.575399 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lzkn7"] Mar 21 04:44:33 crc kubenswrapper[4839]: W0321 04:44:33.577513 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cada35b_7e7f_4d22_895f_588b90e48c70.slice/crio-e31bc068934ef91e47e0dcf3fab8f1d0d0da5df66baa85de8d2947608e34dbb6 WatchSource:0}: Error finding container e31bc068934ef91e47e0dcf3fab8f1d0d0da5df66baa85de8d2947608e34dbb6: Status 404 returned error can't find the container with id e31bc068934ef91e47e0dcf3fab8f1d0d0da5df66baa85de8d2947608e34dbb6 Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.640066 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-zkqb7" podUID="67dd1633-1450-4153-b0af-b6887f61944c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.811782 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lzkn7"] Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.813767 4839 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/keystone-bootstrap-ts52d" event={"ID":"7cada35b-7e7f-4d22-895f-588b90e48c70","Type":"ContainerStarted","Data":"e31bc068934ef91e47e0dcf3fab8f1d0d0da5df66baa85de8d2947608e34dbb6"} Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.818275 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" event={"ID":"93294d9d-21ef-43b5-bac5-35d24543d02a","Type":"ContainerStarted","Data":"2250ba76cd4a4ca0b77733a0ad0032843aaa7963ae9b1e3267d54ad5c182b9aa"} Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.822093 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c6c985f8-v5cmh" event={"ID":"b3b26c3a-55d5-442a-9c31-187b0aa60f90","Type":"ContainerStarted","Data":"b5753b189f3fee68b09fb93ec56788b978b6a6741d48ecf04c45ca76fee101e1"} Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.826825 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9c97f4dbd-k2scs" event={"ID":"579308eb-854d-4160-ad35-8677f2d0e634","Type":"ContainerStarted","Data":"03888d296ca7ab53ceb5c3abefacbc011c167a0d8c4b77409fc31b4badc571f5"} Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.856827 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t8kxj" event={"ID":"6d0e1745-6e0b-475c-a1de-d049018abea6","Type":"ContainerStarted","Data":"143fbf65afa2773912765c6bb85681ce2740b19aa556d5df9884eb40a87ddf95"} Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.871397 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-w74nb"] Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.881941 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wdddk" event={"ID":"e6e87cbd-1f46-4fa0-9529-8250f9fee21c","Type":"ContainerStarted","Data":"3f2d4fa09933468a7b6e88aaba055705019ffd1468416047f45f0ae828c805fe"} Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.882249 4839 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.889956 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d447b4d96-qkb69"] Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.895433 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.902516 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.902917 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.903051 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.903172 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-5mrkq" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.908903 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-w74nb"] Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.937168 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-t8kxj" podStartSLOduration=4.214767286 podStartE2EDuration="34.937134793s" podCreationTimestamp="2026-03-21 04:43:59 +0000 UTC" firstStartedPulling="2026-03-21 04:44:01.699786965 +0000 UTC m=+1246.027573641" lastFinishedPulling="2026-03-21 04:44:32.422154462 +0000 UTC m=+1276.749941148" observedRunningTime="2026-03-21 04:44:33.884727237 +0000 UTC m=+1278.212513913" watchObservedRunningTime="2026-03-21 04:44:33.937134793 +0000 UTC m=+1278.264921489" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.962380 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-d447b4d96-qkb69"] Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.999872 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:33 crc kubenswrapper[4839]: I0321 04:44:33.999940 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-ovndb-tls-certs\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:33.999982 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-svc\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.000279 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.000427 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-config\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: 
\"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.000675 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.000759 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-httpd-config\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.001239 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-config\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.001266 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swz79\" (UniqueName: \"kubernetes.io/projected/12b60d89-b044-4822-bc95-47567123e883-kube-api-access-swz79\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.002363 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-wdddk" podStartSLOduration=5.121571794 podStartE2EDuration="35.002339077s" podCreationTimestamp="2026-03-21 04:43:59 +0000 UTC" 
firstStartedPulling="2026-03-21 04:44:01.363717193 +0000 UTC m=+1245.691503869" lastFinishedPulling="2026-03-21 04:44:31.244484476 +0000 UTC m=+1275.572271152" observedRunningTime="2026-03-21 04:44:33.957277206 +0000 UTC m=+1278.285063892" watchObservedRunningTime="2026-03-21 04:44:34.002339077 +0000 UTC m=+1278.330125753" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.004633 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf4cm\" (UniqueName: \"kubernetes.io/projected/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-kube-api-access-sf4cm\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.004674 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-combined-ca-bundle\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110366 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110414 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-config\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110460 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110484 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-httpd-config\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110593 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-config\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110619 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swz79\" (UniqueName: \"kubernetes.io/projected/12b60d89-b044-4822-bc95-47567123e883-kube-api-access-swz79\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110673 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf4cm\" (UniqueName: \"kubernetes.io/projected/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-kube-api-access-sf4cm\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110693 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-combined-ca-bundle\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110761 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110796 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-ovndb-tls-certs\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.110865 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-svc\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.111254 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-config\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.111392 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.111732 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-svc\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.112066 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.113008 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.120504 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-config\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.122180 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-httpd-config\") pod \"neutron-d447b4d96-qkb69\" (UID: 
\"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.124043 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-combined-ca-bundle\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.124336 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-ovndb-tls-certs\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.130677 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf4cm\" (UniqueName: \"kubernetes.io/projected/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-kube-api-access-sf4cm\") pod \"dnsmasq-dns-6b7b667979-w74nb\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.134922 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swz79\" (UniqueName: \"kubernetes.io/projected/12b60d89-b044-4822-bc95-47567123e883-kube-api-access-swz79\") pod \"neutron-d447b4d96-qkb69\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.207315 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.221110 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.290433 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:44:34 crc kubenswrapper[4839]: W0321 04:44:34.298007 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6008d784_c50b_4079_a7b4_c160b8202956.slice/crio-bc5e3457eb4db025b9a82cb5d0e3fd43071910df3cfe4e9dea88ba3c8fb8cc99 WatchSource:0}: Error finding container bc5e3457eb4db025b9a82cb5d0e3fd43071910df3cfe4e9dea88ba3c8fb8cc99: Status 404 returned error can't find the container with id bc5e3457eb4db025b9a82cb5d0e3fd43071910df3cfe4e9dea88ba3c8fb8cc99 Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.905546 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ts52d" event={"ID":"7cada35b-7e7f-4d22-895f-588b90e48c70","Type":"ContainerStarted","Data":"848904d0e2ac99454595812a77ae5d4f4ec6aacc9198508a3ea49e5fd72d6ee4"} Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.914764 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6008d784-c50b-4079-a7b4-c160b8202956","Type":"ContainerStarted","Data":"bc5e3457eb4db025b9a82cb5d0e3fd43071910df3cfe4e9dea88ba3c8fb8cc99"} Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.921408 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c266726-5bfd-4519-bdd5-9db7f6a77df4","Type":"ContainerStarted","Data":"0a64d9a20f4f5b5d0b9782608a440c655769c9db2754bb98b7278494dc83ae14"} Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.923962 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c6c985f8-v5cmh" 
event={"ID":"b3b26c3a-55d5-442a-9c31-187b0aa60f90","Type":"ContainerStarted","Data":"0bc7ef10848b0da5e68b6c3552cc343013046d2176bf665b0d2389f263149510"} Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.931365 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d447b4d96-qkb69"] Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.935274 4839 generic.go:334] "Generic (PLEG): container finished" podID="93294d9d-21ef-43b5-bac5-35d24543d02a" containerID="74f2d6e648e65be5fa3f29d363526c322d6b87d1a1296638d9d2a658969d838e" exitCode=0 Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.935381 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" event={"ID":"93294d9d-21ef-43b5-bac5-35d24543d02a","Type":"ContainerDied","Data":"74f2d6e648e65be5fa3f29d363526c322d6b87d1a1296638d9d2a658969d838e"} Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.939294 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ts52d" podStartSLOduration=2.9392708069999998 podStartE2EDuration="2.939270807s" podCreationTimestamp="2026-03-21 04:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:34.927999672 +0000 UTC m=+1279.255786348" watchObservedRunningTime="2026-03-21 04:44:34.939270807 +0000 UTC m=+1279.267057483" Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.954406 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9c97f4dbd-k2scs" event={"ID":"579308eb-854d-4160-ad35-8677f2d0e634","Type":"ContainerStarted","Data":"04b6e51342deeff7b4d476258be9896a458d5b6ccb291866248471c868e4ea4c"} Mar 21 04:44:34 crc kubenswrapper[4839]: I0321 04:44:34.978799 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-w74nb"] Mar 21 04:44:34 crc kubenswrapper[4839]: W0321 04:44:34.992415 4839 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeba2ea1_e2d1_46d2_8e89_982cd58f3b15.slice/crio-594e402f9f6a6f0c691bdaa0d2b0e852622545812087194fbffc061c3f4fc05b WatchSource:0}: Error finding container 594e402f9f6a6f0c691bdaa0d2b0e852622545812087194fbffc061c3f4fc05b: Status 404 returned error can't find the container with id 594e402f9f6a6f0c691bdaa0d2b0e852622545812087194fbffc061c3f4fc05b Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.125482 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:44:35 crc kubenswrapper[4839]: W0321 04:44:35.130452 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d7f6c0e_7fdd_4ed0_94ad_e1b044f296f6.slice/crio-895e0558e8de36ac56f79042a259db632c9fdc941b4ad6158154e4d29f6f1e2e WatchSource:0}: Error finding container 895e0558e8de36ac56f79042a259db632c9fdc941b4ad6158154e4d29f6f1e2e: Status 404 returned error can't find the container with id 895e0558e8de36ac56f79042a259db632c9fdc941b4ad6158154e4d29f6f1e2e Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.471073 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.552083 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-config\") pod \"93294d9d-21ef-43b5-bac5-35d24543d02a\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.552404 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-svc\") pod \"93294d9d-21ef-43b5-bac5-35d24543d02a\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.552484 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-sb\") pod \"93294d9d-21ef-43b5-bac5-35d24543d02a\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.552529 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-nb\") pod \"93294d9d-21ef-43b5-bac5-35d24543d02a\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.552634 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gbq4\" (UniqueName: \"kubernetes.io/projected/93294d9d-21ef-43b5-bac5-35d24543d02a-kube-api-access-6gbq4\") pod \"93294d9d-21ef-43b5-bac5-35d24543d02a\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.552664 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-swift-storage-0\") pod \"93294d9d-21ef-43b5-bac5-35d24543d02a\" (UID: \"93294d9d-21ef-43b5-bac5-35d24543d02a\") " Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.586687 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-config" (OuterVolumeSpecName: "config") pod "93294d9d-21ef-43b5-bac5-35d24543d02a" (UID: "93294d9d-21ef-43b5-bac5-35d24543d02a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.616202 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "93294d9d-21ef-43b5-bac5-35d24543d02a" (UID: "93294d9d-21ef-43b5-bac5-35d24543d02a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.625133 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93294d9d-21ef-43b5-bac5-35d24543d02a-kube-api-access-6gbq4" (OuterVolumeSpecName: "kube-api-access-6gbq4") pod "93294d9d-21ef-43b5-bac5-35d24543d02a" (UID: "93294d9d-21ef-43b5-bac5-35d24543d02a"). InnerVolumeSpecName "kube-api-access-6gbq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.626699 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "93294d9d-21ef-43b5-bac5-35d24543d02a" (UID: "93294d9d-21ef-43b5-bac5-35d24543d02a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.626776 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93294d9d-21ef-43b5-bac5-35d24543d02a" (UID: "93294d9d-21ef-43b5-bac5-35d24543d02a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.637021 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "93294d9d-21ef-43b5-bac5-35d24543d02a" (UID: "93294d9d-21ef-43b5-bac5-35d24543d02a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.665041 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.665072 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.665114 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.665127 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gbq4\" (UniqueName: \"kubernetes.io/projected/93294d9d-21ef-43b5-bac5-35d24543d02a-kube-api-access-6gbq4\") on node \"crc\" DevicePath \"\"" Mar 21 
04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.665137 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.665148 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93294d9d-21ef-43b5-bac5-35d24543d02a-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.986912 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d447b4d96-qkb69" event={"ID":"12b60d89-b044-4822-bc95-47567123e883","Type":"ContainerStarted","Data":"67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90"} Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.987040 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d447b4d96-qkb69" event={"ID":"12b60d89-b044-4822-bc95-47567123e883","Type":"ContainerStarted","Data":"e44c81ce53fb7cb7cb67615e87aced0fc7bd4c886cbd53ea268fc23a5209a592"} Mar 21 04:44:35 crc kubenswrapper[4839]: I0321 04:44:35.993523 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6","Type":"ContainerStarted","Data":"895e0558e8de36ac56f79042a259db632c9fdc941b4ad6158154e4d29f6f1e2e"} Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.003023 4839 generic.go:334] "Generic (PLEG): container finished" podID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerID="4d47c811396da6675e98576b8f9d542f9a6e50f5a5df44132f5048a5caae6747" exitCode=0 Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.003102 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" 
event={"ID":"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15","Type":"ContainerDied","Data":"4d47c811396da6675e98576b8f9d542f9a6e50f5a5df44132f5048a5caae6747"} Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.003129 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" event={"ID":"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15","Type":"ContainerStarted","Data":"594e402f9f6a6f0c691bdaa0d2b0e852622545812087194fbffc061c3f4fc05b"} Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.010624 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6008d784-c50b-4079-a7b4-c160b8202956","Type":"ContainerStarted","Data":"656d2bd6e96a6dfef03b94b31d13a8f2dda820b33aef3803e391faf4b5d221eb"} Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.016168 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c6c985f8-v5cmh" event={"ID":"b3b26c3a-55d5-442a-9c31-187b0aa60f90","Type":"ContainerStarted","Data":"e004b9646c4df34c1d5bba67912a6fa76f3cccc25c7980ab777e369e37ce16c9"} Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.079597 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" event={"ID":"93294d9d-21ef-43b5-bac5-35d24543d02a","Type":"ContainerDied","Data":"2250ba76cd4a4ca0b77733a0ad0032843aaa7963ae9b1e3267d54ad5c182b9aa"} Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.079668 4839 scope.go:117] "RemoveContainer" containerID="74f2d6e648e65be5fa3f29d363526c322d6b87d1a1296638d9d2a658969d838e" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.079907 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lzkn7" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.112092 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9c97f4dbd-k2scs" event={"ID":"579308eb-854d-4160-ad35-8677f2d0e634","Type":"ContainerStarted","Data":"8ea4c51163beea5b097beaa576e1811227967d58157ac9043d0eaf20c8c0eed3"} Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.122501 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-84c6c985f8-v5cmh" podStartSLOduration=26.966153139 podStartE2EDuration="28.122477398s" podCreationTimestamp="2026-03-21 04:44:08 +0000 UTC" firstStartedPulling="2026-03-21 04:44:33.322392995 +0000 UTC m=+1277.650179671" lastFinishedPulling="2026-03-21 04:44:34.478717254 +0000 UTC m=+1278.806503930" observedRunningTime="2026-03-21 04:44:36.111609584 +0000 UTC m=+1280.439396280" watchObservedRunningTime="2026-03-21 04:44:36.122477398 +0000 UTC m=+1280.450264084" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.193619 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9c97f4dbd-k2scs" podStartSLOduration=27.034844181 podStartE2EDuration="28.193597128s" podCreationTimestamp="2026-03-21 04:44:08 +0000 UTC" firstStartedPulling="2026-03-21 04:44:33.321792338 +0000 UTC m=+1277.649579014" lastFinishedPulling="2026-03-21 04:44:34.480545285 +0000 UTC m=+1278.808331961" observedRunningTime="2026-03-21 04:44:36.151986433 +0000 UTC m=+1280.479773129" watchObservedRunningTime="2026-03-21 04:44:36.193597128 +0000 UTC m=+1280.521383804" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.211493 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lzkn7"] Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.225017 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lzkn7"] Mar 21 04:44:36 crc kubenswrapper[4839]: 
I0321 04:44:36.477801 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93294d9d-21ef-43b5-bac5-35d24543d02a" path="/var/lib/kubelet/pods/93294d9d-21ef-43b5-bac5-35d24543d02a/volumes" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.519094 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-98964f649-mrjrt"] Mar 21 04:44:36 crc kubenswrapper[4839]: E0321 04:44:36.520103 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93294d9d-21ef-43b5-bac5-35d24543d02a" containerName="init" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.520128 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="93294d9d-21ef-43b5-bac5-35d24543d02a" containerName="init" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.520351 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="93294d9d-21ef-43b5-bac5-35d24543d02a" containerName="init" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.521449 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.525809 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.526087 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.548727 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-98964f649-mrjrt"] Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.609508 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-httpd-config\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.609815 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-config\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.609934 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-combined-ca-bundle\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.610076 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-internal-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.610245 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-ovndb-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.610464 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-public-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.610538 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbt6d\" (UniqueName: \"kubernetes.io/projected/e965d008-890b-408c-a5a8-823aca00140a-kube-api-access-wbt6d\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.712790 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-httpd-config\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.712832 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-config\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.712858 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-combined-ca-bundle\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.712998 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-internal-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.713096 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-ovndb-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.713185 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-public-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.713211 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbt6d\" (UniqueName: \"kubernetes.io/projected/e965d008-890b-408c-a5a8-823aca00140a-kube-api-access-wbt6d\") pod 
\"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.718060 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-httpd-config\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.719345 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-internal-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.720429 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-combined-ca-bundle\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.721616 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-public-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.722327 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-ovndb-tls-certs\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc 
kubenswrapper[4839]: I0321 04:44:36.723115 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-config\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.733910 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbt6d\" (UniqueName: \"kubernetes.io/projected/e965d008-890b-408c-a5a8-823aca00140a-kube-api-access-wbt6d\") pod \"neutron-98964f649-mrjrt\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") " pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:36 crc kubenswrapper[4839]: I0321 04:44:36.904062 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.141781 4839 generic.go:334] "Generic (PLEG): container finished" podID="e6e87cbd-1f46-4fa0-9529-8250f9fee21c" containerID="3f2d4fa09933468a7b6e88aaba055705019ffd1468416047f45f0ae828c805fe" exitCode=0 Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.142138 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wdddk" event={"ID":"e6e87cbd-1f46-4fa0-9529-8250f9fee21c","Type":"ContainerDied","Data":"3f2d4fa09933468a7b6e88aaba055705019ffd1468416047f45f0ae828c805fe"} Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.153906 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d447b4d96-qkb69" event={"ID":"12b60d89-b044-4822-bc95-47567123e883","Type":"ContainerStarted","Data":"b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc"} Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.154810 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 
04:44:37.156355 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6","Type":"ContainerStarted","Data":"c262061b12e953fbbacb47f4b9530de8433e420c520843fb5f6b2637c033c0d3"} Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.162308 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" event={"ID":"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15","Type":"ContainerStarted","Data":"715b056e0e8951dcb0bce46eff3f4cc77b23970621f830f7f64bde0192431e68"} Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.162517 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.165236 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-log" containerID="cri-o://656d2bd6e96a6dfef03b94b31d13a8f2dda820b33aef3803e391faf4b5d221eb" gracePeriod=30 Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.165308 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6008d784-c50b-4079-a7b4-c160b8202956","Type":"ContainerStarted","Data":"0a86b207740d0f8a7f90c03c8833a6d77c693f85ec7fea71ebc1f6a1cbaaac94"} Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.165363 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-httpd" containerID="cri-o://0a86b207740d0f8a7f90c03c8833a6d77c693f85ec7fea71ebc1f6a1cbaaac94" gracePeriod=30 Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.239688 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d447b4d96-qkb69" 
podStartSLOduration=4.239666561 podStartE2EDuration="4.239666561s" podCreationTimestamp="2026-03-21 04:44:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:37.21067525 +0000 UTC m=+1281.538461946" watchObservedRunningTime="2026-03-21 04:44:37.239666561 +0000 UTC m=+1281.567453237" Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.261746 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=21.261721628 podStartE2EDuration="21.261721628s" podCreationTimestamp="2026-03-21 04:44:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:37.232815969 +0000 UTC m=+1281.560602655" watchObservedRunningTime="2026-03-21 04:44:37.261721628 +0000 UTC m=+1281.589508304" Mar 21 04:44:37 crc kubenswrapper[4839]: I0321 04:44:37.285245 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" podStartSLOduration=4.285216935 podStartE2EDuration="4.285216935s" podCreationTimestamp="2026-03-21 04:44:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:37.256542163 +0000 UTC m=+1281.584328849" watchObservedRunningTime="2026-03-21 04:44:37.285216935 +0000 UTC m=+1281.613003621" Mar 21 04:44:38 crc kubenswrapper[4839]: I0321 04:44:38.165206 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-98964f649-mrjrt"] Mar 21 04:44:38 crc kubenswrapper[4839]: I0321 04:44:38.186337 4839 generic.go:334] "Generic (PLEG): container finished" podID="6008d784-c50b-4079-a7b4-c160b8202956" containerID="0a86b207740d0f8a7f90c03c8833a6d77c693f85ec7fea71ebc1f6a1cbaaac94" exitCode=0 Mar 21 04:44:38 crc kubenswrapper[4839]: I0321 
04:44:38.186376 4839 generic.go:334] "Generic (PLEG): container finished" podID="6008d784-c50b-4079-a7b4-c160b8202956" containerID="656d2bd6e96a6dfef03b94b31d13a8f2dda820b33aef3803e391faf4b5d221eb" exitCode=143 Mar 21 04:44:38 crc kubenswrapper[4839]: I0321 04:44:38.186376 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6008d784-c50b-4079-a7b4-c160b8202956","Type":"ContainerDied","Data":"0a86b207740d0f8a7f90c03c8833a6d77c693f85ec7fea71ebc1f6a1cbaaac94"} Mar 21 04:44:38 crc kubenswrapper[4839]: I0321 04:44:38.186420 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6008d784-c50b-4079-a7b4-c160b8202956","Type":"ContainerDied","Data":"656d2bd6e96a6dfef03b94b31d13a8f2dda820b33aef3803e391faf4b5d221eb"} Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.198157 4839 generic.go:334] "Generic (PLEG): container finished" podID="6d0e1745-6e0b-475c-a1de-d049018abea6" containerID="143fbf65afa2773912765c6bb85681ce2740b19aa556d5df9884eb40a87ddf95" exitCode=0 Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.198737 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t8kxj" event={"ID":"6d0e1745-6e0b-475c-a1de-d049018abea6","Type":"ContainerDied","Data":"143fbf65afa2773912765c6bb85681ce2740b19aa556d5df9884eb40a87ddf95"} Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.202409 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6","Type":"ContainerStarted","Data":"169f86351d3c6f2a1f7c5f547a5ee33563ee9e6728a88fde0e0f1b926cb83f77"} Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.202558 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-log" 
containerID="cri-o://c262061b12e953fbbacb47f4b9530de8433e420c520843fb5f6b2637c033c0d3" gracePeriod=30 Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.202872 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-httpd" containerID="cri-o://169f86351d3c6f2a1f7c5f547a5ee33563ee9e6728a88fde0e0f1b926cb83f77" gracePeriod=30 Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.205267 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98964f649-mrjrt" event={"ID":"e965d008-890b-408c-a5a8-823aca00140a","Type":"ContainerStarted","Data":"965f07a77abecc2dcc57bce66cbf446672e1ba03feecba479f6dc24ff5964cee"} Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.261531 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-84c6c985f8-v5cmh" Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.263205 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-84c6c985f8-v5cmh" Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.328732 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-9c97f4dbd-k2scs" Mar 21 04:44:39 crc kubenswrapper[4839]: I0321 04:44:39.328784 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9c97f4dbd-k2scs" Mar 21 04:44:40 crc kubenswrapper[4839]: I0321 04:44:40.218207 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6","Type":"ContainerDied","Data":"169f86351d3c6f2a1f7c5f547a5ee33563ee9e6728a88fde0e0f1b926cb83f77"} Mar 21 04:44:40 crc kubenswrapper[4839]: I0321 04:44:40.218144 4839 generic.go:334] "Generic (PLEG): container finished" podID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" 
containerID="169f86351d3c6f2a1f7c5f547a5ee33563ee9e6728a88fde0e0f1b926cb83f77" exitCode=0 Mar 21 04:44:40 crc kubenswrapper[4839]: I0321 04:44:40.218613 4839 generic.go:334] "Generic (PLEG): container finished" podID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerID="c262061b12e953fbbacb47f4b9530de8433e420c520843fb5f6b2637c033c0d3" exitCode=143 Mar 21 04:44:40 crc kubenswrapper[4839]: I0321 04:44:40.218725 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6","Type":"ContainerDied","Data":"c262061b12e953fbbacb47f4b9530de8433e420c520843fb5f6b2637c033c0d3"} Mar 21 04:44:41 crc kubenswrapper[4839]: I0321 04:44:41.226645 4839 generic.go:334] "Generic (PLEG): container finished" podID="7cada35b-7e7f-4d22-895f-588b90e48c70" containerID="848904d0e2ac99454595812a77ae5d4f4ec6aacc9198508a3ea49e5fd72d6ee4" exitCode=0 Mar 21 04:44:41 crc kubenswrapper[4839]: I0321 04:44:41.226981 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ts52d" event={"ID":"7cada35b-7e7f-4d22-895f-588b90e48c70","Type":"ContainerDied","Data":"848904d0e2ac99454595812a77ae5d4f4ec6aacc9198508a3ea49e5fd72d6ee4"} Mar 21 04:44:41 crc kubenswrapper[4839]: I0321 04:44:41.255889 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=25.255870796 podStartE2EDuration="25.255870796s" podCreationTimestamp="2026-03-21 04:44:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:39.237951224 +0000 UTC m=+1283.565737910" watchObservedRunningTime="2026-03-21 04:44:41.255870796 +0000 UTC m=+1285.583657472" Mar 21 04:44:41 crc kubenswrapper[4839]: I0321 04:44:41.957611 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.056881 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-combined-ca-bundle\") pod \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.057308 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-scripts\") pod \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.057364 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-config-data\") pod \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.057393 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdb5v\" (UniqueName: \"kubernetes.io/projected/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-kube-api-access-tdb5v\") pod \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.057440 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-logs\") pod \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\" (UID: \"e6e87cbd-1f46-4fa0-9529-8250f9fee21c\") " Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.058272 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-logs" (OuterVolumeSpecName: "logs") pod "e6e87cbd-1f46-4fa0-9529-8250f9fee21c" (UID: "e6e87cbd-1f46-4fa0-9529-8250f9fee21c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.065108 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-scripts" (OuterVolumeSpecName: "scripts") pod "e6e87cbd-1f46-4fa0-9529-8250f9fee21c" (UID: "e6e87cbd-1f46-4fa0-9529-8250f9fee21c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.065490 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-kube-api-access-tdb5v" (OuterVolumeSpecName: "kube-api-access-tdb5v") pod "e6e87cbd-1f46-4fa0-9529-8250f9fee21c" (UID: "e6e87cbd-1f46-4fa0-9529-8250f9fee21c"). InnerVolumeSpecName "kube-api-access-tdb5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.090323 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-config-data" (OuterVolumeSpecName: "config-data") pod "e6e87cbd-1f46-4fa0-9529-8250f9fee21c" (UID: "e6e87cbd-1f46-4fa0-9529-8250f9fee21c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.090759 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6e87cbd-1f46-4fa0-9529-8250f9fee21c" (UID: "e6e87cbd-1f46-4fa0-9529-8250f9fee21c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.159805 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.159845 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.159856 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.159864 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdb5v\" (UniqueName: \"kubernetes.io/projected/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-kube-api-access-tdb5v\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.159876 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e87cbd-1f46-4fa0-9529-8250f9fee21c-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.240786 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wdddk" event={"ID":"e6e87cbd-1f46-4fa0-9529-8250f9fee21c","Type":"ContainerDied","Data":"e10dee2b21cfdb75da16c639d865bd8e8d3823159b603d6fda5a875f34a0fb47"} Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.240806 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wdddk" Mar 21 04:44:42 crc kubenswrapper[4839]: I0321 04:44:42.240826 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e10dee2b21cfdb75da16c639d865bd8e8d3823159b603d6fda5a875f34a0fb47" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.162109 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5788c8f798-khqlb"] Mar 21 04:44:43 crc kubenswrapper[4839]: E0321 04:44:43.162708 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e87cbd-1f46-4fa0-9529-8250f9fee21c" containerName="placement-db-sync" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.162720 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e87cbd-1f46-4fa0-9529-8250f9fee21c" containerName="placement-db-sync" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.162891 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6e87cbd-1f46-4fa0-9529-8250f9fee21c" containerName="placement-db-sync" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.163815 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.170249 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.170457 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qhlcg" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.170612 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.170471 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.170745 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.196090 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5788c8f798-khqlb"] Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.278721 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-public-tls-certs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.278771 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-config-data\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.278790 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-scripts\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.278806 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-logs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.278848 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-internal-tls-certs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.278875 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nhpg\" (UniqueName: \"kubernetes.io/projected/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-kube-api-access-4nhpg\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.278927 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-combined-ca-bundle\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.380361 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-public-tls-certs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.380405 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-config-data\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.380444 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-scripts\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.380465 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-logs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.380507 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-internal-tls-certs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.380538 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nhpg\" (UniqueName: 
\"kubernetes.io/projected/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-kube-api-access-4nhpg\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.380653 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-combined-ca-bundle\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.382253 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-logs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.385049 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-scripts\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.386295 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-public-tls-certs\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.389003 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-internal-tls-certs\") pod \"placement-5788c8f798-khqlb\" (UID: 
\"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.389651 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-config-data\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.398856 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-combined-ca-bundle\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.400240 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nhpg\" (UniqueName: \"kubernetes.io/projected/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-kube-api-access-4nhpg\") pod \"placement-5788c8f798-khqlb\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:43 crc kubenswrapper[4839]: I0321 04:44:43.487416 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.209762 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.274280 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mr2ng"] Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.274522 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" podUID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerName="dnsmasq-dns" containerID="cri-o://14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598" gracePeriod=10 Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.733110 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.773081 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.784316 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.829441 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-httpd-run\") pod \"6008d784-c50b-4079-a7b4-c160b8202956\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.829517 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wc6r\" (UniqueName: \"kubernetes.io/projected/6008d784-c50b-4079-a7b4-c160b8202956-kube-api-access-7wc6r\") pod \"6008d784-c50b-4079-a7b4-c160b8202956\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.829668 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-scripts\") pod \"6008d784-c50b-4079-a7b4-c160b8202956\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.829738 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"6008d784-c50b-4079-a7b4-c160b8202956\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.829766 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-combined-ca-bundle\") pod \"6008d784-c50b-4079-a7b4-c160b8202956\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.829820 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-config-data\") pod \"6008d784-c50b-4079-a7b4-c160b8202956\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.834664 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-logs\") pod \"6008d784-c50b-4079-a7b4-c160b8202956\" (UID: \"6008d784-c50b-4079-a7b4-c160b8202956\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.833876 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6008d784-c50b-4079-a7b4-c160b8202956" (UID: "6008d784-c50b-4079-a7b4-c160b8202956"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.835360 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-logs" (OuterVolumeSpecName: "logs") pod "6008d784-c50b-4079-a7b4-c160b8202956" (UID: "6008d784-c50b-4079-a7b4-c160b8202956"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.835495 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.835524 4839 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6008d784-c50b-4079-a7b4-c160b8202956-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.860585 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-scripts" (OuterVolumeSpecName: "scripts") pod "6008d784-c50b-4079-a7b4-c160b8202956" (UID: "6008d784-c50b-4079-a7b4-c160b8202956"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.861353 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6008d784-c50b-4079-a7b4-c160b8202956-kube-api-access-7wc6r" (OuterVolumeSpecName: "kube-api-access-7wc6r") pod "6008d784-c50b-4079-a7b4-c160b8202956" (UID: "6008d784-c50b-4079-a7b4-c160b8202956"). InnerVolumeSpecName "kube-api-access-7wc6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.861835 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "6008d784-c50b-4079-a7b4-c160b8202956" (UID: "6008d784-c50b-4079-a7b4-c160b8202956"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.937524 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkzvv\" (UniqueName: \"kubernetes.io/projected/6d0e1745-6e0b-475c-a1de-d049018abea6-kube-api-access-hkzvv\") pod \"6d0e1745-6e0b-475c-a1de-d049018abea6\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.940614 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-db-sync-config-data\") pod \"6d0e1745-6e0b-475c-a1de-d049018abea6\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.940679 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h955\" (UniqueName: \"kubernetes.io/projected/7cada35b-7e7f-4d22-895f-588b90e48c70-kube-api-access-5h955\") pod \"7cada35b-7e7f-4d22-895f-588b90e48c70\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.940706 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-credential-keys\") pod \"7cada35b-7e7f-4d22-895f-588b90e48c70\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.940804 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-combined-ca-bundle\") pod \"6d0e1745-6e0b-475c-a1de-d049018abea6\" (UID: \"6d0e1745-6e0b-475c-a1de-d049018abea6\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.940878 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-config-data\") pod \"7cada35b-7e7f-4d22-895f-588b90e48c70\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.941010 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-combined-ca-bundle\") pod \"7cada35b-7e7f-4d22-895f-588b90e48c70\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.941191 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-fernet-keys\") pod \"7cada35b-7e7f-4d22-895f-588b90e48c70\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.941253 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-scripts\") pod \"7cada35b-7e7f-4d22-895f-588b90e48c70\" (UID: \"7cada35b-7e7f-4d22-895f-588b90e48c70\") " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.942542 4839 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.942562 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wc6r\" (UniqueName: \"kubernetes.io/projected/6008d784-c50b-4079-a7b4-c160b8202956-kube-api-access-7wc6r\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.942603 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.948183 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cada35b-7e7f-4d22-895f-588b90e48c70-kube-api-access-5h955" (OuterVolumeSpecName: "kube-api-access-5h955") pod "7cada35b-7e7f-4d22-895f-588b90e48c70" (UID: "7cada35b-7e7f-4d22-895f-588b90e48c70"). InnerVolumeSpecName "kube-api-access-5h955". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.973527 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7cada35b-7e7f-4d22-895f-588b90e48c70" (UID: "7cada35b-7e7f-4d22-895f-588b90e48c70"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.977215 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6d0e1745-6e0b-475c-a1de-d049018abea6" (UID: "6d0e1745-6e0b-475c-a1de-d049018abea6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.978838 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d0e1745-6e0b-475c-a1de-d049018abea6-kube-api-access-hkzvv" (OuterVolumeSpecName: "kube-api-access-hkzvv") pod "6d0e1745-6e0b-475c-a1de-d049018abea6" (UID: "6d0e1745-6e0b-475c-a1de-d049018abea6"). InnerVolumeSpecName "kube-api-access-hkzvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.984342 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7cada35b-7e7f-4d22-895f-588b90e48c70" (UID: "7cada35b-7e7f-4d22-895f-588b90e48c70"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:44 crc kubenswrapper[4839]: I0321 04:44:44.991956 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-scripts" (OuterVolumeSpecName: "scripts") pod "7cada35b-7e7f-4d22-895f-588b90e48c70" (UID: "7cada35b-7e7f-4d22-895f-588b90e48c70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.014295 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.044321 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkzvv\" (UniqueName: \"kubernetes.io/projected/6d0e1745-6e0b-475c-a1de-d049018abea6-kube-api-access-hkzvv\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.044352 4839 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.044366 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h955\" (UniqueName: \"kubernetes.io/projected/7cada35b-7e7f-4d22-895f-588b90e48c70-kube-api-access-5h955\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.044378 4839 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.044389 4839 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.044400 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.050377 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.100093 4839 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.146627 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-scripts\") pod \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.146705 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-combined-ca-bundle\") pod \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.146704 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5788c8f798-khqlb"] Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.146853 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdqdc\" (UniqueName: \"kubernetes.io/projected/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-kube-api-access-zdqdc\") pod \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.148439 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-logs\") pod \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.148586 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.148882 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-config-data\") pod \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.149063 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-svc\") pod \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.149292 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-sb\") pod \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.150670 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7fdg\" (UniqueName: \"kubernetes.io/projected/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-kube-api-access-n7fdg\") pod \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.150720 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-swift-storage-0\") pod \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 
04:44:45.150776 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-nb\") pod \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.151193 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-config\") pod \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\" (UID: \"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.151237 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-httpd-run\") pod \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\" (UID: \"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6\") " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.153279 4839 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.154396 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" (UID: "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.156450 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cada35b-7e7f-4d22-895f-588b90e48c70" (UID: "7cada35b-7e7f-4d22-895f-588b90e48c70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.156861 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-logs" (OuterVolumeSpecName: "logs") pod "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" (UID: "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.165223 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-kube-api-access-zdqdc" (OuterVolumeSpecName: "kube-api-access-zdqdc") pod "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" (UID: "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6"). InnerVolumeSpecName "kube-api-access-zdqdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.178461 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-kube-api-access-n7fdg" (OuterVolumeSpecName: "kube-api-access-n7fdg") pod "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" (UID: "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb"). InnerVolumeSpecName "kube-api-access-n7fdg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.178804 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d0e1745-6e0b-475c-a1de-d049018abea6" (UID: "6d0e1745-6e0b-475c-a1de-d049018abea6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.181073 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" (UID: "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.182695 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-scripts" (OuterVolumeSpecName: "scripts") pod "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" (UID: "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.182730 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-config-data" (OuterVolumeSpecName: "config-data") pod "7cada35b-7e7f-4d22-895f-588b90e48c70" (UID: "7cada35b-7e7f-4d22-895f-588b90e48c70"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.195669 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6008d784-c50b-4079-a7b4-c160b8202956" (UID: "6008d784-c50b-4079-a7b4-c160b8202956"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.200512 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-config-data" (OuterVolumeSpecName: "config-data") pod "6008d784-c50b-4079-a7b4-c160b8202956" (UID: "6008d784-c50b-4079-a7b4-c160b8202956"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.226070 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" (UID: "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255167 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255205 4839 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255218 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255228 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255243 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0e1745-6e0b-475c-a1de-d049018abea6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255254 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdqdc\" (UniqueName: \"kubernetes.io/projected/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-kube-api-access-zdqdc\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255266 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255276 4839 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255300 4839 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255313 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cada35b-7e7f-4d22-895f-588b90e48c70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255326 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7fdg\" (UniqueName: \"kubernetes.io/projected/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-kube-api-access-n7fdg\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.255339 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6008d784-c50b-4079-a7b4-c160b8202956-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.262814 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" (UID: "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.274498 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98964f649-mrjrt" event={"ID":"e965d008-890b-408c-a5a8-823aca00140a","Type":"ContainerStarted","Data":"6e416952cf65a99f24d43cb637a81bb2e071806b75507c88029a3d669986edf2"} Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.279885 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5788c8f798-khqlb" event={"ID":"30c2fe46-cd8a-43f9-8968-b6e65d7c862a","Type":"ContainerStarted","Data":"3897ef34a5a560221b0da70d53a0118dcc2423f236d8ea84230926286a71f6ee"} Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.282173 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-config" (OuterVolumeSpecName: "config") pod "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" (UID: "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.282701 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-config-data" (OuterVolumeSpecName: "config-data") pod "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" (UID: "3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.285507 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t8kxj" event={"ID":"6d0e1745-6e0b-475c-a1de-d049018abea6","Type":"ContainerDied","Data":"00637144ea664e135a3a03c08667ad9ad9e5c84e3814ae65ec02c62e19d9549d"} Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.285558 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00637144ea664e135a3a03c08667ad9ad9e5c84e3814ae65ec02c62e19d9549d" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.285638 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-t8kxj" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.286509 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" (UID: "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.290212 4839 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.296833 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6","Type":"ContainerDied","Data":"895e0558e8de36ac56f79042a259db632c9fdc941b4ad6158154e4d29f6f1e2e"} Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.296835 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.296915 4839 scope.go:117] "RemoveContainer" containerID="169f86351d3c6f2a1f7c5f547a5ee33563ee9e6728a88fde0e0f1b926cb83f77" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.307826 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ts52d" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.308510 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ts52d" event={"ID":"7cada35b-7e7f-4d22-895f-588b90e48c70","Type":"ContainerDied","Data":"e31bc068934ef91e47e0dcf3fab8f1d0d0da5df66baa85de8d2947608e34dbb6"} Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.308604 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e31bc068934ef91e47e0dcf3fab8f1d0d0da5df66baa85de8d2947608e34dbb6" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.314039 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6008d784-c50b-4079-a7b4-c160b8202956","Type":"ContainerDied","Data":"bc5e3457eb4db025b9a82cb5d0e3fd43071910df3cfe4e9dea88ba3c8fb8cc99"} Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.314116 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.317299 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" (UID: "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.320027 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c266726-5bfd-4519-bdd5-9db7f6a77df4","Type":"ContainerStarted","Data":"97884e844baf73e80ed5f7a5c51d988d7d7009365523dceffa2a7bc9d1e19948"} Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.321410 4839 generic.go:334] "Generic (PLEG): container finished" podID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerID="14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598" exitCode=0 Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.321436 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" event={"ID":"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb","Type":"ContainerDied","Data":"14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598"} Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.321451 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" event={"ID":"e8a38db0-ab1b-4555-bb52-a903e9c6b5bb","Type":"ContainerDied","Data":"2dbe5499f5ce6b46711307e191213d3557376716206b4f9aec95cbff6dcd4f72"} Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.321495 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-mr2ng" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.327595 4839 scope.go:117] "RemoveContainer" containerID="c262061b12e953fbbacb47f4b9530de8433e420c520843fb5f6b2637c033c0d3" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.334534 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" (UID: "e8a38db0-ab1b-4555-bb52-a903e9c6b5bb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.357123 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.390064 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.390095 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.390106 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.390116 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.390124 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.390132 4839 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.390141 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.394708 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.414273 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.450482 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.466610 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467005 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-log" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467018 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-log" Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467033 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-httpd" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467039 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-httpd" Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467047 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-httpd" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467054 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-httpd" Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467068 
4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-log" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467074 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-log" Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467088 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerName="dnsmasq-dns" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467094 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerName="dnsmasq-dns" Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467108 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerName="init" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467114 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerName="init" Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467125 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d0e1745-6e0b-475c-a1de-d049018abea6" containerName="barbican-db-sync" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467132 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d0e1745-6e0b-475c-a1de-d049018abea6" containerName="barbican-db-sync" Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.467145 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cada35b-7e7f-4d22-895f-588b90e48c70" containerName="keystone-bootstrap" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467153 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cada35b-7e7f-4d22-895f-588b90e48c70" containerName="keystone-bootstrap" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467295 4839 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="7cada35b-7e7f-4d22-895f-588b90e48c70" containerName="keystone-bootstrap" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467321 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-log" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467338 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-log" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467349 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" containerName="glance-httpd" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467359 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" containerName="dnsmasq-dns" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467366 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d0e1745-6e0b-475c-a1de-d049018abea6" containerName="barbican-db-sync" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.467380 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6008d784-c50b-4079-a7b4-c160b8202956" containerName="glance-httpd" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.468290 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.474834 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.475719 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.475879 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.476034 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v5bc4" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.485077 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.485869 4839 scope.go:117] "RemoveContainer" containerID="0a86b207740d0f8a7f90c03c8833a6d77c693f85ec7fea71ebc1f6a1cbaaac94" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.491997 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.492076 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.492097 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgs9m\" (UniqueName: \"kubernetes.io/projected/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-kube-api-access-kgs9m\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.492116 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.492209 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-logs\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.492246 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-scripts\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.492315 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-config-data\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 
04:44:45.492354 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.500498 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.506545 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.508472 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.509076 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.525035 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.569347 4839 scope.go:117] "RemoveContainer" containerID="656d2bd6e96a6dfef03b94b31d13a8f2dda820b33aef3803e391faf4b5d221eb" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594102 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594155 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594188 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594203 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgs9m\" (UniqueName: \"kubernetes.io/projected/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-kube-api-access-kgs9m\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594218 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594242 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-logs\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594265 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-scripts\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594319 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-config-data\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.594887 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.595714 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-logs\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.596522 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.602038 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.607326 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.607380 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-scripts\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.616365 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.621560 4839 scope.go:117] "RemoveContainer" containerID="14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.628433 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgs9m\" (UniqueName: \"kubernetes.io/projected/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-kube-api-access-kgs9m\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.644921 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695584 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695633 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfhfr\" (UniqueName: \"kubernetes.io/projected/506e1e04-5787-48bb-9165-96a55f0d3095-kube-api-access-tfhfr\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695672 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-config-data\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695699 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-logs\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695836 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695861 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695882 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.695915 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-scripts\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.787699 4839 scope.go:117] "RemoveContainer" containerID="52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.795774 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mr2ng"] Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807494 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-config-data\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807585 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-logs\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807838 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807877 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807905 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807944 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807975 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.807997 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfhfr\" (UniqueName: \"kubernetes.io/projected/506e1e04-5787-48bb-9165-96a55f0d3095-kube-api-access-tfhfr\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.818635 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.819489 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-config-data\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.820040 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.827667 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-logs\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.828188 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-scripts\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.845689 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-mr2ng"] Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.857800 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.859627 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.871132 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.874643 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfhfr\" (UniqueName: \"kubernetes.io/projected/506e1e04-5787-48bb-9165-96a55f0d3095-kube-api-access-tfhfr\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.893753 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " pod="openstack/glance-default-internal-api-0" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.914314 4839 scope.go:117] "RemoveContainer" containerID="14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598" Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.918940 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598\": container with ID starting with 14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598 not found: ID does not exist" containerID="14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.918984 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598"} err="failed to get container status \"14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598\": rpc error: code = NotFound desc = could not find container \"14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598\": container with ID starting with 
14b1c90fa0f6d90a07ec7699dcaccca8b3deb82e4f9b4495708e06fb5363f598 not found: ID does not exist" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.919008 4839 scope.go:117] "RemoveContainer" containerID="52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349" Mar 21 04:44:45 crc kubenswrapper[4839]: E0321 04:44:45.925173 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349\": container with ID starting with 52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349 not found: ID does not exist" containerID="52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349" Mar 21 04:44:45 crc kubenswrapper[4839]: I0321 04:44:45.925214 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349"} err="failed to get container status \"52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349\": rpc error: code = NotFound desc = could not find container \"52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349\": container with ID starting with 52b2f2f770bf2eb0997bcf2aef429b0705724f1b09c55ddcbec469134135b349 not found: ID does not exist" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.005461 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-77466dd775-brs5x"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.007060 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.011252 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qnmpn" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.011372 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.011444 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.033394 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-56b894998b-l59vx"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.034816 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.037532 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.066628 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-77466dd775-brs5x"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118486 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data-custom\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118542 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118608 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data-custom\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118635 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118671 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmt9m\" (UniqueName: \"kubernetes.io/projected/392fb516-8745-40fb-b38d-53106c8310df-kube-api-access-xmt9m\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118691 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/392fb516-8745-40fb-b38d-53106c8310df-logs\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118727 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n782h\" (UniqueName: \"kubernetes.io/projected/e5f64e49-61a6-4601-b37b-f9af6079108c-kube-api-access-n782h\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118767 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-combined-ca-bundle\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118800 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.118814 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f64e49-61a6-4601-b37b-f9af6079108c-logs\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.132620 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-56b894998b-l59vx"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.166163 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-k67ln"] Mar 21 
04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.167881 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.181305 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cb996784d-fvhvp"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.181957 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.184314 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.191100 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pzsvm" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.191374 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.191617 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.191853 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.191992 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.192147 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.203947 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-k67ln"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.223424 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-cb996784d-fvhvp"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.223741 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.223798 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f64e49-61a6-4601-b37b-f9af6079108c-logs\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.223844 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data-custom\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.223880 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.223967 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data-custom\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: 
\"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.224017 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.224082 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmt9m\" (UniqueName: \"kubernetes.io/projected/392fb516-8745-40fb-b38d-53106c8310df-kube-api-access-xmt9m\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.224128 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/392fb516-8745-40fb-b38d-53106c8310df-logs\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.224205 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n782h\" (UniqueName: \"kubernetes.io/projected/e5f64e49-61a6-4601-b37b-f9af6079108c-kube-api-access-n782h\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.224287 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-combined-ca-bundle\") pod 
\"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.226919 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/392fb516-8745-40fb-b38d-53106c8310df-logs\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.227677 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f64e49-61a6-4601-b37b-f9af6079108c-logs\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.230296 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data-custom\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.230528 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data-custom\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.231222 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data\") pod \"barbican-keystone-listener-56b894998b-l59vx\" 
(UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.234270 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.234992 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.235775 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-combined-ca-bundle\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.260516 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmt9m\" (UniqueName: \"kubernetes.io/projected/392fb516-8745-40fb-b38d-53106c8310df-kube-api-access-xmt9m\") pod \"barbican-worker-77466dd775-brs5x\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.281475 4839 scope.go:117] "RemoveContainer" containerID="3564e41aa34a1722e5c61a5b47bf82e1bb5bc4612fbb2dc888f7e8b1d996cdd6" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.287718 4839 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-n782h\" (UniqueName: \"kubernetes.io/projected/e5f64e49-61a6-4601-b37b-f9af6079108c-kube-api-access-n782h\") pod \"barbican-keystone-listener-56b894998b-l59vx\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.295435 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-59cc78c49d-pkvb5"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.299808 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.309353 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.321964 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59cc78c49d-pkvb5"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325633 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-credential-keys\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325702 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn7xh\" (UniqueName: \"kubernetes.io/projected/ac45c53b-2486-47d1-aaf4-23b76adfd431-kube-api-access-xn7xh\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325721 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-combined-ca-bundle\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325754 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-fernet-keys\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325771 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-config-data\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325794 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-config\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325809 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325834 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-scripts\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325854 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325877 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325902 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-internal-tls-certs\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325932 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.325977 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6qvlb\" (UniqueName: \"kubernetes.io/projected/6a3fcdf0-3099-467b-928b-89a4876130fe-kube-api-access-6qvlb\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.326019 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-public-tls-certs\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.376603 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98964f649-mrjrt" event={"ID":"e965d008-890b-408c-a5a8-823aca00140a","Type":"ContainerStarted","Data":"c52f7b158358ef8b38cfac03210bf15a4ca76a8dbb9c567dbd73763e507062d1"} Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.376825 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-98964f649-mrjrt" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.386815 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.389720 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.399877 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5788c8f798-khqlb" event={"ID":"30c2fe46-cd8a-43f9-8968-b6e65d7c862a","Type":"ContainerStarted","Data":"ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5"} Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.400271 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5788c8f798-khqlb" event={"ID":"30c2fe46-cd8a-43f9-8968-b6e65d7c862a","Type":"ContainerStarted","Data":"50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345"} Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.401167 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.403044 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.412239 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7b946d96f4-chv76"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.413903 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428176 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-combined-ca-bundle\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428219 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-fernet-keys\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428242 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-config-data\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428269 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-config\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428286 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " 
pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428311 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-scripts\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428328 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsv2l\" (UniqueName: \"kubernetes.io/projected/08c48ff3-782c-4f5a-8a20-0736565e247a-kube-api-access-lsv2l\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428352 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428372 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428391 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-internal-tls-certs\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 
04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428409 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428448 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qvlb\" (UniqueName: \"kubernetes.io/projected/6a3fcdf0-3099-467b-928b-89a4876130fe-kube-api-access-6qvlb\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428488 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-public-tls-certs\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428519 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data-custom\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428542 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-credential-keys\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: 
I0321 04:44:46.428558 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428672 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn7xh\" (UniqueName: \"kubernetes.io/projected/ac45c53b-2486-47d1-aaf4-23b76adfd431-kube-api-access-xn7xh\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428691 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-combined-ca-bundle\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.428714 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c48ff3-782c-4f5a-8a20-0736565e247a-logs\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.431643 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.432253 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.432858 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.433190 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-config\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.434043 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.435074 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-fernet-keys\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.437267 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-config-data\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.439073 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-scripts\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.439145 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-db77b8b5f-grbp8"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.440244 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-credential-keys\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.442124 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.451968 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-public-tls-certs\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.457206 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-combined-ca-bundle\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.480950 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a3fcdf0-3099-467b-928b-89a4876130fe-internal-tls-certs\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.487276 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn7xh\" (UniqueName: \"kubernetes.io/projected/ac45c53b-2486-47d1-aaf4-23b76adfd431-kube-api-access-xn7xh\") pod \"dnsmasq-dns-848cf88cfc-k67ln\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") " pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.490268 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qvlb\" (UniqueName: \"kubernetes.io/projected/6a3fcdf0-3099-467b-928b-89a4876130fe-kube-api-access-6qvlb\") pod \"keystone-cb996784d-fvhvp\" (UID: \"6a3fcdf0-3099-467b-928b-89a4876130fe\") " pod="openstack/keystone-cb996784d-fvhvp" Mar 21 
04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.490704 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.511074 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.528167 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-98964f649-mrjrt" podStartSLOduration=10.528147519000001 podStartE2EDuration="10.528147519s" podCreationTimestamp="2026-03-21 04:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:46.457365399 +0000 UTC m=+1290.785152075" watchObservedRunningTime="2026-03-21 04:44:46.528147519 +0000 UTC m=+1290.855934195" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.532804 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-config-data\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.532918 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-config-data-custom\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.532937 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-config-data-custom\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.532965 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-config-data\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.532989 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lbgq\" (UniqueName: \"kubernetes.io/projected/3563c0f9-9e82-4798-bae3-b3836a6b5866-kube-api-access-2lbgq\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533031 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data-custom\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533062 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-combined-ca-bundle\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533092 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533154 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c48ff3-782c-4f5a-8a20-0736565e247a-logs\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533206 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-combined-ca-bundle\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533252 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e03301-fb6e-467b-b19d-21b5c475d35c-logs\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533319 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsv2l\" (UniqueName: \"kubernetes.io/projected/08c48ff3-782c-4f5a-8a20-0736565e247a-kube-api-access-lsv2l\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533342 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-combined-ca-bundle\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533404 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4r45\" (UniqueName: \"kubernetes.io/projected/e6e03301-fb6e-467b-b19d-21b5c475d35c-kube-api-access-z4r45\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.533450 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3563c0f9-9e82-4798-bae3-b3836a6b5866-logs\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.543849 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c48ff3-782c-4f5a-8a20-0736565e247a-logs\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.550488 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-combined-ca-bundle\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.574148 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.575733 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data-custom\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.587732 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6" path="/var/lib/kubelet/pods/3d7f6c0e-7fdd-4ed0-94ad-e1b044f296f6/volumes" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.588523 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6008d784-c50b-4079-a7b4-c160b8202956" path="/var/lib/kubelet/pods/6008d784-c50b-4079-a7b4-c160b8202956/volumes" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.588703 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsv2l\" (UniqueName: \"kubernetes.io/projected/08c48ff3-782c-4f5a-8a20-0736565e247a-kube-api-access-lsv2l\") pod \"barbican-api-59cc78c49d-pkvb5\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.601534 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8a38db0-ab1b-4555-bb52-a903e9c6b5bb" path="/var/lib/kubelet/pods/e8a38db0-ab1b-4555-bb52-a903e9c6b5bb/volumes" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.620387 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-keystone-listener-7b946d96f4-chv76"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.620434 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-db77b8b5f-grbp8"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.620874 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.641033 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3563c0f9-9e82-4798-bae3-b3836a6b5866-logs\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.641105 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-config-data\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.689725 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-config-data-custom\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.690044 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-config-data-custom\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" 
Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.690079 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-config-data\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.690118 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lbgq\" (UniqueName: \"kubernetes.io/projected/3563c0f9-9e82-4798-bae3-b3836a6b5866-kube-api-access-2lbgq\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.690184 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-combined-ca-bundle\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.690351 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e03301-fb6e-467b-b19d-21b5c475d35c-logs\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.690417 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-combined-ca-bundle\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " 
pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.690465 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4r45\" (UniqueName: \"kubernetes.io/projected/e6e03301-fb6e-467b-b19d-21b5c475d35c-kube-api-access-z4r45\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.693486 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3563c0f9-9e82-4798-bae3-b3836a6b5866-logs\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.704854 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e03301-fb6e-467b-b19d-21b5c475d35c-logs\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.707883 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5788c8f798-khqlb" podStartSLOduration=3.707869757 podStartE2EDuration="3.707869757s" podCreationTimestamp="2026-03-21 04:44:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:46.535220937 +0000 UTC m=+1290.863007613" watchObservedRunningTime="2026-03-21 04:44:46.707869757 +0000 UTC m=+1291.035656433" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.708100 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-config-data\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.724401 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4r45\" (UniqueName: \"kubernetes.io/projected/e6e03301-fb6e-467b-b19d-21b5c475d35c-kube-api-access-z4r45\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.745151 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lbgq\" (UniqueName: \"kubernetes.io/projected/3563c0f9-9e82-4798-bae3-b3836a6b5866-kube-api-access-2lbgq\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.750370 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-combined-ca-bundle\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.755707 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-config-data\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.760485 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-config-data-custom\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.767136 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6e03301-fb6e-467b-b19d-21b5c475d35c-config-data-custom\") pod \"barbican-keystone-listener-7b946d96f4-chv76\" (UID: \"e6e03301-fb6e-467b-b19d-21b5c475d35c\") " pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.768761 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-67dd687666-pgfc5"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.775020 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3563c0f9-9e82-4798-bae3-b3836a6b5866-combined-ca-bundle\") pod \"barbican-worker-db77b8b5f-grbp8\" (UID: \"3563c0f9-9e82-4798-bae3-b3836a6b5866\") " pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.802460 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-67dd687666-pgfc5"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.802559 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.805143 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c8735c-f1c9-40f7-bd34-60bb0749bc23-logs\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.805221 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.805259 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj9bx\" (UniqueName: \"kubernetes.io/projected/10c8735c-f1c9-40f7-bd34-60bb0749bc23-kube-api-access-pj9bx\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.805376 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data-custom\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.805402 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-combined-ca-bundle\") pod 
\"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.906660 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.906739 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj9bx\" (UniqueName: \"kubernetes.io/projected/10c8735c-f1c9-40f7-bd34-60bb0749bc23-kube-api-access-pj9bx\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.906813 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data-custom\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.906838 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-combined-ca-bundle\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.908842 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c8735c-f1c9-40f7-bd34-60bb0749bc23-logs\") pod \"barbican-api-67dd687666-pgfc5\" (UID: 
\"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.909444 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c8735c-f1c9-40f7-bd34-60bb0749bc23-logs\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.913001 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data-custom\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.929401 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.946500 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-combined-ca-bundle\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.952301 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.967278 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj9bx\" (UniqueName: 
\"kubernetes.io/projected/10c8735c-f1c9-40f7-bd34-60bb0749bc23-kube-api-access-pj9bx\") pod \"barbican-api-67dd687666-pgfc5\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:46 crc kubenswrapper[4839]: I0321 04:44:46.995636 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.009891 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-db77b8b5f-grbp8" Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.127202 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.136199 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.259172 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-56b894998b-l59vx"] Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.408377 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cb996784d-fvhvp"] Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.429863 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-77466dd775-brs5x"] Mar 21 04:44:47 crc kubenswrapper[4839]: W0321 04:44:47.431689 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod392fb516_8745_40fb_b38d_53106c8310df.slice/crio-aaed11a9eedbf42719456ee569b2249eb2ed204275c2e327bc69e0e9ba6d0c2e WatchSource:0}: Error finding container aaed11a9eedbf42719456ee569b2249eb2ed204275c2e327bc69e0e9ba6d0c2e: Status 404 returned error can't find the container with id 
aaed11a9eedbf42719456ee569b2249eb2ed204275c2e327bc69e0e9ba6d0c2e Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.607533 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59cc78c49d-pkvb5"] Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.624904 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-k67ln"] Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.635589 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qfjms" event={"ID":"6000d2d4-e84a-443f-9094-ab999541331d","Type":"ContainerStarted","Data":"89d53502805454e28eeac8a6f5794fb2c1a2eba3acba95c45fd4f0d839ae56ac"} Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.661909 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-qfjms" podStartSLOduration=4.662872252 podStartE2EDuration="48.661888245s" podCreationTimestamp="2026-03-21 04:43:59 +0000 UTC" firstStartedPulling="2026-03-21 04:44:01.030684456 +0000 UTC m=+1245.358471132" lastFinishedPulling="2026-03-21 04:44:45.029700459 +0000 UTC m=+1289.357487125" observedRunningTime="2026-03-21 04:44:47.658952143 +0000 UTC m=+1291.986738839" watchObservedRunningTime="2026-03-21 04:44:47.661888245 +0000 UTC m=+1291.989674921" Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.673020 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77466dd775-brs5x" event={"ID":"392fb516-8745-40fb-b38d-53106c8310df","Type":"ContainerStarted","Data":"aaed11a9eedbf42719456ee569b2249eb2ed204275c2e327bc69e0e9ba6d0c2e"} Mar 21 04:44:47 crc kubenswrapper[4839]: W0321 04:44:47.675585 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08c48ff3_782c_4f5a_8a20_0736565e247a.slice/crio-75d6eae50b83f1f8ec52b3d138f4404879d9130c0a03aa11c82efdf2866d9d7d WatchSource:0}: Error finding container 
75d6eae50b83f1f8ec52b3d138f4404879d9130c0a03aa11c82efdf2866d9d7d: Status 404 returned error can't find the container with id 75d6eae50b83f1f8ec52b3d138f4404879d9130c0a03aa11c82efdf2866d9d7d Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.676953 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"524772c8-3fdb-43dc-8532-1d8e9dcdeb97","Type":"ContainerStarted","Data":"d1e4b5b263d8711e41038cc9c72c0cf72e4c984c445036bd50e6733715123ea1"} Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.689945 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cb996784d-fvhvp" event={"ID":"6a3fcdf0-3099-467b-928b-89a4876130fe","Type":"ContainerStarted","Data":"bd36894c5fe54247c8be01f9d997698010ac96edc9b8aff94a62d4d015134c51"} Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.708852 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" event={"ID":"e5f64e49-61a6-4601-b37b-f9af6079108c","Type":"ContainerStarted","Data":"5a91be0fc03b74a93a2e4a4a9276bbbc5ad15e2d8798017231a484ecfe7f2ff6"} Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.715385 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"506e1e04-5787-48bb-9165-96a55f0d3095","Type":"ContainerStarted","Data":"b3b20c1c58919d92014e2b8b23b7b38a20303479312dd8c6c51224fcd3f18728"} Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.917173 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-db77b8b5f-grbp8"] Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.961714 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7b946d96f4-chv76"] Mar 21 04:44:47 crc kubenswrapper[4839]: I0321 04:44:47.981262 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-67dd687666-pgfc5"] Mar 21 04:44:48 crc 
kubenswrapper[4839]: W0321 04:44:48.076920 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10c8735c_f1c9_40f7_bd34_60bb0749bc23.slice/crio-eef1fbdee77ab0e9e93f444be08641661339d7257383b84dc3aa743e29072b31 WatchSource:0}: Error finding container eef1fbdee77ab0e9e93f444be08641661339d7257383b84dc3aa743e29072b31: Status 404 returned error can't find the container with id eef1fbdee77ab0e9e93f444be08641661339d7257383b84dc3aa743e29072b31 Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.755793 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59cc78c49d-pkvb5" event={"ID":"08c48ff3-782c-4f5a-8a20-0736565e247a","Type":"ContainerStarted","Data":"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9"} Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.756140 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59cc78c49d-pkvb5" event={"ID":"08c48ff3-782c-4f5a-8a20-0736565e247a","Type":"ContainerStarted","Data":"75d6eae50b83f1f8ec52b3d138f4404879d9130c0a03aa11c82efdf2866d9d7d"} Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.763037 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cb996784d-fvhvp" event={"ID":"6a3fcdf0-3099-467b-928b-89a4876130fe","Type":"ContainerStarted","Data":"995e93aab27ceb2b4a1d4104b2418622a19b1beaed4ed2fe4d5c82d14a2d55d4"} Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.763112 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-cb996784d-fvhvp" Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.778439 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"524772c8-3fdb-43dc-8532-1d8e9dcdeb97","Type":"ContainerStarted","Data":"7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc"} Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 
04:44:48.790617 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cb996784d-fvhvp" podStartSLOduration=2.790594172 podStartE2EDuration="2.790594172s" podCreationTimestamp="2026-03-21 04:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:48.779207773 +0000 UTC m=+1293.106994449" watchObservedRunningTime="2026-03-21 04:44:48.790594172 +0000 UTC m=+1293.118380848" Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.791697 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-db77b8b5f-grbp8" event={"ID":"3563c0f9-9e82-4798-bae3-b3836a6b5866","Type":"ContainerStarted","Data":"70074b5be9f0a546822a32bf575056e24fb92673a44bc210c38a301ddd216343"} Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.794861 4839 generic.go:334] "Generic (PLEG): container finished" podID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerID="d6b588c7e0ea2083a499b261b5c79b627db47988e3cf9da2d927f03f127f5e76" exitCode=0 Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.794912 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" event={"ID":"ac45c53b-2486-47d1-aaf4-23b76adfd431","Type":"ContainerDied","Data":"d6b588c7e0ea2083a499b261b5c79b627db47988e3cf9da2d927f03f127f5e76"} Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.794935 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" event={"ID":"ac45c53b-2486-47d1-aaf4-23b76adfd431","Type":"ContainerStarted","Data":"2f1d63d47cab7235a54c65fca44b344c795fcf2a4d8ecfd84ead6214de4729cf"} Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.808918 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67dd687666-pgfc5" 
event={"ID":"10c8735c-f1c9-40f7-bd34-60bb0749bc23","Type":"ContainerStarted","Data":"5b137b53eba217c749f810e3fe6d4536182b4cec7923324d43b649cbc888ca03"} Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.809240 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67dd687666-pgfc5" event={"ID":"10c8735c-f1c9-40f7-bd34-60bb0749bc23","Type":"ContainerStarted","Data":"eef1fbdee77ab0e9e93f444be08641661339d7257383b84dc3aa743e29072b31"} Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.832544 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" event={"ID":"e6e03301-fb6e-467b-b19d-21b5c475d35c","Type":"ContainerStarted","Data":"fca5e18e77c121faaaf0ada8e901cc40df5819da429b90285bcb8be9c693b177"} Mar 21 04:44:48 crc kubenswrapper[4839]: I0321 04:44:48.841321 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"506e1e04-5787-48bb-9165-96a55f0d3095","Type":"ContainerStarted","Data":"688009d7356d78e3eb36a5befafccac32153750022bd8fbc6ea8dbee86aced35"} Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.266359 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.329828 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-9c97f4dbd-k2scs" podUID="579308eb-854d-4160-ad35-8677f2d0e634" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.494927 4839 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/barbican-api-59cc78c49d-pkvb5"] Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.522708 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d9cf4c794-jb7lf"] Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.525738 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.527853 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.528117 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.544152 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d9cf4c794-jb7lf"] Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.633158 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzwb8\" (UniqueName: \"kubernetes.io/projected/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-kube-api-access-dzwb8\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.633327 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-config-data-custom\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.633435 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-config-data\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.633508 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-internal-tls-certs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.633534 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-logs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.633600 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-combined-ca-bundle\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.633622 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-public-tls-certs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.735583 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-config-data\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.735636 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-internal-tls-certs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.735661 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-logs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.735709 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-combined-ca-bundle\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.735729 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-public-tls-certs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.735801 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzwb8\" (UniqueName: 
\"kubernetes.io/projected/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-kube-api-access-dzwb8\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.735848 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-config-data-custom\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.736631 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-logs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.743682 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-combined-ca-bundle\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.747490 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-config-data\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.749951 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-public-tls-certs\") pod 
\"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.751702 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-internal-tls-certs\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.753192 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-config-data-custom\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.755199 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzwb8\" (UniqueName: \"kubernetes.io/projected/37ba14c5-dfc7-4268-86c9-c0efe37fe6c9-kube-api-access-dzwb8\") pod \"barbican-api-d9cf4c794-jb7lf\" (UID: \"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9\") " pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.862184 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"506e1e04-5787-48bb-9165-96a55f0d3095","Type":"ContainerStarted","Data":"9d897b01178474175025269c566e1858192f12c1b5756dd643a41a358a91f169"} Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.867696 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59cc78c49d-pkvb5" event={"ID":"08c48ff3-782c-4f5a-8a20-0736565e247a","Type":"ContainerStarted","Data":"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b"} Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.867772 4839 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.868133 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.878632 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"524772c8-3fdb-43dc-8532-1d8e9dcdeb97","Type":"ContainerStarted","Data":"f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34"} Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.878835 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.890254 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.890238025 podStartE2EDuration="4.890238025s" podCreationTimestamp="2026-03-21 04:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:49.884109453 +0000 UTC m=+1294.211896129" watchObservedRunningTime="2026-03-21 04:44:49.890238025 +0000 UTC m=+1294.218024701" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.900392 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" event={"ID":"ac45c53b-2486-47d1-aaf4-23b76adfd431","Type":"ContainerStarted","Data":"d7d86bc6d96470a04c1fc681cf73561b422455dc884417b0677e9ae418f682f0"} Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.901207 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.906886 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-api-59cc78c49d-pkvb5" podStartSLOduration=3.9068737799999997 podStartE2EDuration="3.90687378s" podCreationTimestamp="2026-03-21 04:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:49.905620145 +0000 UTC m=+1294.233406821" watchObservedRunningTime="2026-03-21 04:44:49.90687378 +0000 UTC m=+1294.234660456" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.912986 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67dd687666-pgfc5" event={"ID":"10c8735c-f1c9-40f7-bd34-60bb0749bc23","Type":"ContainerStarted","Data":"77ad1711dc27b34bdfe011c55c79bdc7d6ef8a5e0c42e7951254c65ef19efa51"} Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.913054 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.913342 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.935790 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.935761508 podStartE2EDuration="4.935761508s" podCreationTimestamp="2026-03-21 04:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:49.933827594 +0000 UTC m=+1294.261614270" watchObservedRunningTime="2026-03-21 04:44:49.935761508 +0000 UTC m=+1294.263548184" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.959926 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" podStartSLOduration=3.959909934 podStartE2EDuration="3.959909934s" podCreationTimestamp="2026-03-21 04:44:46 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:49.953502344 +0000 UTC m=+1294.281289020" watchObservedRunningTime="2026-03-21 04:44:49.959909934 +0000 UTC m=+1294.287696610" Mar 21 04:44:49 crc kubenswrapper[4839]: I0321 04:44:49.981971 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-67dd687666-pgfc5" podStartSLOduration=3.98194831 podStartE2EDuration="3.98194831s" podCreationTimestamp="2026-03-21 04:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:49.973312109 +0000 UTC m=+1294.301098795" watchObservedRunningTime="2026-03-21 04:44:49.98194831 +0000 UTC m=+1294.309734986" Mar 21 04:44:50 crc kubenswrapper[4839]: I0321 04:44:50.502403 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d9cf4c794-jb7lf"] Mar 21 04:44:50 crc kubenswrapper[4839]: I0321 04:44:50.920548 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d9cf4c794-jb7lf" event={"ID":"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9","Type":"ContainerStarted","Data":"d854a871bf480c473cddf8b21d5c13c7c57ca42f1882e6429c31909d901bd04a"} Mar 21 04:44:50 crc kubenswrapper[4839]: I0321 04:44:50.921179 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59cc78c49d-pkvb5" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api" containerID="cri-o://3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b" gracePeriod=30 Mar 21 04:44:50 crc kubenswrapper[4839]: I0321 04:44:50.921140 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59cc78c49d-pkvb5" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api-log" 
containerID="cri-o://006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9" gracePeriod=30 Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.599321 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.691610 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c48ff3-782c-4f5a-8a20-0736565e247a-logs\") pod \"08c48ff3-782c-4f5a-8a20-0736565e247a\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.691669 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-combined-ca-bundle\") pod \"08c48ff3-782c-4f5a-8a20-0736565e247a\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.691730 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data\") pod \"08c48ff3-782c-4f5a-8a20-0736565e247a\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.691781 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data-custom\") pod \"08c48ff3-782c-4f5a-8a20-0736565e247a\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.691841 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsv2l\" (UniqueName: \"kubernetes.io/projected/08c48ff3-782c-4f5a-8a20-0736565e247a-kube-api-access-lsv2l\") pod 
\"08c48ff3-782c-4f5a-8a20-0736565e247a\" (UID: \"08c48ff3-782c-4f5a-8a20-0736565e247a\") " Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.692036 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c48ff3-782c-4f5a-8a20-0736565e247a-logs" (OuterVolumeSpecName: "logs") pod "08c48ff3-782c-4f5a-8a20-0736565e247a" (UID: "08c48ff3-782c-4f5a-8a20-0736565e247a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.692468 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08c48ff3-782c-4f5a-8a20-0736565e247a-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.695613 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c48ff3-782c-4f5a-8a20-0736565e247a-kube-api-access-lsv2l" (OuterVolumeSpecName: "kube-api-access-lsv2l") pod "08c48ff3-782c-4f5a-8a20-0736565e247a" (UID: "08c48ff3-782c-4f5a-8a20-0736565e247a"). InnerVolumeSpecName "kube-api-access-lsv2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.697687 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "08c48ff3-782c-4f5a-8a20-0736565e247a" (UID: "08c48ff3-782c-4f5a-8a20-0736565e247a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.733722 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08c48ff3-782c-4f5a-8a20-0736565e247a" (UID: "08c48ff3-782c-4f5a-8a20-0736565e247a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.793754 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.793780 4839 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.793788 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsv2l\" (UniqueName: \"kubernetes.io/projected/08c48ff3-782c-4f5a-8a20-0736565e247a-kube-api-access-lsv2l\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.810594 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data" (OuterVolumeSpecName: "config-data") pod "08c48ff3-782c-4f5a-8a20-0736565e247a" (UID: "08c48ff3-782c-4f5a-8a20-0736565e247a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.904045 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c48ff3-782c-4f5a-8a20-0736565e247a-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.934997 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77466dd775-brs5x" event={"ID":"392fb516-8745-40fb-b38d-53106c8310df","Type":"ContainerStarted","Data":"ae270d4db24cad72c19c535910d2db30e454a5a89554335e5a48e6588c21e3f2"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.939401 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" event={"ID":"e5f64e49-61a6-4601-b37b-f9af6079108c","Type":"ContainerStarted","Data":"d2facd4bdb131b2c8282fb6aaa7c8482ec973fccdc0e979a7ef3bb2d591fc401"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.939444 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" event={"ID":"e5f64e49-61a6-4601-b37b-f9af6079108c","Type":"ContainerStarted","Data":"a4599ccee0ab959c4e13333beacaf8a8188b2b3effd5f393fa70626300ad559e"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.950377 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" event={"ID":"e6e03301-fb6e-467b-b19d-21b5c475d35c","Type":"ContainerStarted","Data":"0109ad2133b2847c0789961fe4df245b3330bac2bdaa17fccfe8427de3b0f70a"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.950428 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" event={"ID":"e6e03301-fb6e-467b-b19d-21b5c475d35c","Type":"ContainerStarted","Data":"225ca07006eea0413be478bddfc73f7cf7ca14abc2e370b12af874b935b50e00"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 
04:44:51.962996 4839 generic.go:334] "Generic (PLEG): container finished" podID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerID="3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b" exitCode=0 Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.963029 4839 generic.go:334] "Generic (PLEG): container finished" podID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerID="006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9" exitCode=143 Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.963073 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59cc78c49d-pkvb5" event={"ID":"08c48ff3-782c-4f5a-8a20-0736565e247a","Type":"ContainerDied","Data":"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.963100 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59cc78c49d-pkvb5" event={"ID":"08c48ff3-782c-4f5a-8a20-0736565e247a","Type":"ContainerDied","Data":"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.963110 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59cc78c49d-pkvb5" event={"ID":"08c48ff3-782c-4f5a-8a20-0736565e247a","Type":"ContainerDied","Data":"75d6eae50b83f1f8ec52b3d138f4404879d9130c0a03aa11c82efdf2866d9d7d"} Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.963128 4839 scope.go:117] "RemoveContainer" containerID="3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.963228 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-59cc78c49d-pkvb5" Mar 21 04:44:51 crc kubenswrapper[4839]: I0321 04:44:51.973317 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d9cf4c794-jb7lf" event={"ID":"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9","Type":"ContainerStarted","Data":"a6b1b6f8886247bae06bf792f12be02011d856b581bc06b70f970fa3ce5a8ba9"} Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.001436 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-db77b8b5f-grbp8" event={"ID":"3563c0f9-9e82-4798-bae3-b3836a6b5866","Type":"ContainerStarted","Data":"8bd67185751e9518d82306412b1a995f91b84a70526a6d67cfc3e74508e78318"} Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.003687 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" podStartSLOduration=3.417777461 podStartE2EDuration="7.003662418s" podCreationTimestamp="2026-03-21 04:44:45 +0000 UTC" firstStartedPulling="2026-03-21 04:44:47.403667611 +0000 UTC m=+1291.731454287" lastFinishedPulling="2026-03-21 04:44:50.989552568 +0000 UTC m=+1295.317339244" observedRunningTime="2026-03-21 04:44:51.971173609 +0000 UTC m=+1296.298960285" watchObservedRunningTime="2026-03-21 04:44:52.003662418 +0000 UTC m=+1296.331449094" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.005185 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7b946d96f4-chv76" podStartSLOduration=3.090337297 podStartE2EDuration="6.00517622s" podCreationTimestamp="2026-03-21 04:44:46 +0000 UTC" firstStartedPulling="2026-03-21 04:44:48.080322031 +0000 UTC m=+1292.408108707" lastFinishedPulling="2026-03-21 04:44:50.995160954 +0000 UTC m=+1295.322947630" observedRunningTime="2026-03-21 04:44:51.996777125 +0000 UTC m=+1296.324563801" watchObservedRunningTime="2026-03-21 04:44:52.00517622 +0000 UTC m=+1296.332962896" Mar 21 04:44:52 
crc kubenswrapper[4839]: I0321 04:44:52.027493 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-db77b8b5f-grbp8" podStartSLOduration=3.060209184 podStartE2EDuration="6.027479244s" podCreationTimestamp="2026-03-21 04:44:46 +0000 UTC" firstStartedPulling="2026-03-21 04:44:48.02271675 +0000 UTC m=+1292.350503436" lastFinishedPulling="2026-03-21 04:44:50.98998682 +0000 UTC m=+1295.317773496" observedRunningTime="2026-03-21 04:44:52.024405978 +0000 UTC m=+1296.352192654" watchObservedRunningTime="2026-03-21 04:44:52.027479244 +0000 UTC m=+1296.355265920" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.059822 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-56b894998b-l59vx"] Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.059830 4839 scope.go:117] "RemoveContainer" containerID="006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.072692 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-59cc78c49d-pkvb5"] Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.092553 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-59cc78c49d-pkvb5"] Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.098688 4839 scope.go:117] "RemoveContainer" containerID="3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.103444 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-77466dd775-brs5x"] Mar 21 04:44:52 crc kubenswrapper[4839]: E0321 04:44:52.103725 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b\": container with ID starting with 
3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b not found: ID does not exist" containerID="3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.103819 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b"} err="failed to get container status \"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b\": rpc error: code = NotFound desc = could not find container \"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b\": container with ID starting with 3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b not found: ID does not exist" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.103910 4839 scope.go:117] "RemoveContainer" containerID="006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9" Mar 21 04:44:52 crc kubenswrapper[4839]: E0321 04:44:52.106777 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9\": container with ID starting with 006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9 not found: ID does not exist" containerID="006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.106814 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9"} err="failed to get container status \"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9\": rpc error: code = NotFound desc = could not find container \"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9\": container with ID starting with 006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9 not found: ID does not 
exist" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.106840 4839 scope.go:117] "RemoveContainer" containerID="3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.107085 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b"} err="failed to get container status \"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b\": rpc error: code = NotFound desc = could not find container \"3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b\": container with ID starting with 3c96822ce29fd36df89bd69bb9910a63e4f7280b5ee31572e7428e248b386c7b not found: ID does not exist" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.107104 4839 scope.go:117] "RemoveContainer" containerID="006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.109386 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9"} err="failed to get container status \"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9\": rpc error: code = NotFound desc = could not find container \"006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9\": container with ID starting with 006498edc7390e4a3c12026f57aa6ecaf4e9b90cf55099e0f8f58424b7b384f9 not found: ID does not exist" Mar 21 04:44:52 crc kubenswrapper[4839]: I0321 04:44:52.499640 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" path="/var/lib/kubelet/pods/08c48ff3-782c-4f5a-8a20-0736565e247a/volumes" Mar 21 04:44:53 crc kubenswrapper[4839]: I0321 04:44:53.018869 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77466dd775-brs5x" 
event={"ID":"392fb516-8745-40fb-b38d-53106c8310df","Type":"ContainerStarted","Data":"fd761f8a43876dc94432e55d47974d0e86c8a21f74c77ba35b0fda3b1a64872e"} Mar 21 04:44:53 crc kubenswrapper[4839]: I0321 04:44:53.031998 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d9cf4c794-jb7lf" event={"ID":"37ba14c5-dfc7-4268-86c9-c0efe37fe6c9","Type":"ContainerStarted","Data":"ead5ea8cce6422b9395b5e4042e0f142a18f226fac54adc05d9902b6f49bcd2b"} Mar 21 04:44:53 crc kubenswrapper[4839]: I0321 04:44:53.034849 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:53 crc kubenswrapper[4839]: I0321 04:44:53.034891 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:44:53 crc kubenswrapper[4839]: I0321 04:44:53.042217 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-db77b8b5f-grbp8" event={"ID":"3563c0f9-9e82-4798-bae3-b3836a6b5866","Type":"ContainerStarted","Data":"9d92b450d693bd25d9bf1a793ee362f3b967d2ff5568fa333eb1ee5c645df9e7"} Mar 21 04:44:53 crc kubenswrapper[4839]: I0321 04:44:53.042707 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-77466dd775-brs5x" podStartSLOduration=4.483561038 podStartE2EDuration="8.042684805s" podCreationTimestamp="2026-03-21 04:44:45 +0000 UTC" firstStartedPulling="2026-03-21 04:44:47.433929488 +0000 UTC m=+1291.761716164" lastFinishedPulling="2026-03-21 04:44:50.993053255 +0000 UTC m=+1295.320839931" observedRunningTime="2026-03-21 04:44:53.039715602 +0000 UTC m=+1297.367502278" watchObservedRunningTime="2026-03-21 04:44:53.042684805 +0000 UTC m=+1297.370471481" Mar 21 04:44:53 crc kubenswrapper[4839]: I0321 04:44:53.074004 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d9cf4c794-jb7lf" podStartSLOduration=4.073986201 
podStartE2EDuration="4.073986201s" podCreationTimestamp="2026-03-21 04:44:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:44:53.062122779 +0000 UTC m=+1297.389909465" watchObservedRunningTime="2026-03-21 04:44:53.073986201 +0000 UTC m=+1297.401772877" Mar 21 04:44:54 crc kubenswrapper[4839]: I0321 04:44:54.050483 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-77466dd775-brs5x" podUID="392fb516-8745-40fb-b38d-53106c8310df" containerName="barbican-worker-log" containerID="cri-o://ae270d4db24cad72c19c535910d2db30e454a5a89554335e5a48e6588c21e3f2" gracePeriod=30 Mar 21 04:44:54 crc kubenswrapper[4839]: I0321 04:44:54.050530 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-77466dd775-brs5x" podUID="392fb516-8745-40fb-b38d-53106c8310df" containerName="barbican-worker" containerID="cri-o://fd761f8a43876dc94432e55d47974d0e86c8a21f74c77ba35b0fda3b1a64872e" gracePeriod=30 Mar 21 04:44:54 crc kubenswrapper[4839]: I0321 04:44:54.051854 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener-log" containerID="cri-o://a4599ccee0ab959c4e13333beacaf8a8188b2b3effd5f393fa70626300ad559e" gracePeriod=30 Mar 21 04:44:54 crc kubenswrapper[4839]: I0321 04:44:54.051960 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener" containerID="cri-o://d2facd4bdb131b2c8282fb6aaa7c8482ec973fccdc0e979a7ef3bb2d591fc401" gracePeriod=30 Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.060554 4839 generic.go:334] "Generic (PLEG): container finished" 
podID="392fb516-8745-40fb-b38d-53106c8310df" containerID="ae270d4db24cad72c19c535910d2db30e454a5a89554335e5a48e6588c21e3f2" exitCode=143 Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.060604 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77466dd775-brs5x" event={"ID":"392fb516-8745-40fb-b38d-53106c8310df","Type":"ContainerDied","Data":"ae270d4db24cad72c19c535910d2db30e454a5a89554335e5a48e6588c21e3f2"} Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.066405 4839 generic.go:334] "Generic (PLEG): container finished" podID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerID="d2facd4bdb131b2c8282fb6aaa7c8482ec973fccdc0e979a7ef3bb2d591fc401" exitCode=0 Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.066433 4839 generic.go:334] "Generic (PLEG): container finished" podID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerID="a4599ccee0ab959c4e13333beacaf8a8188b2b3effd5f393fa70626300ad559e" exitCode=143 Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.066501 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" event={"ID":"e5f64e49-61a6-4601-b37b-f9af6079108c","Type":"ContainerDied","Data":"d2facd4bdb131b2c8282fb6aaa7c8482ec973fccdc0e979a7ef3bb2d591fc401"} Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.066560 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" event={"ID":"e5f64e49-61a6-4601-b37b-f9af6079108c","Type":"ContainerDied","Data":"a4599ccee0ab959c4e13333beacaf8a8188b2b3effd5f393fa70626300ad559e"} Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.984532 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 21 04:44:55 crc kubenswrapper[4839]: I0321 04:44:55.985442 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 21 
04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.027921 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.039726 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.083108 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.083147 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.183431 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.183476 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.221716 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.493689 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.554831 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-w74nb"] Mar 21 04:44:56 crc kubenswrapper[4839]: I0321 04:44:56.555062 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerName="dnsmasq-dns" containerID="cri-o://715b056e0e8951dcb0bce46eff3f4cc77b23970621f830f7f64bde0192431e68" gracePeriod=10 Mar 21 04:44:56 crc 
kubenswrapper[4839]: I0321 04:44:56.736242 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:57 crc kubenswrapper[4839]: I0321 04:44:57.091787 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:57 crc kubenswrapper[4839]: I0321 04:44:57.092202 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:57 crc kubenswrapper[4839]: I0321 04:44:57.134837 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-67dd687666-pgfc5" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 21 04:44:58 crc kubenswrapper[4839]: I0321 04:44:58.209327 4839 generic.go:334] "Generic (PLEG): container finished" podID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerID="715b056e0e8951dcb0bce46eff3f4cc77b23970621f830f7f64bde0192431e68" exitCode=0 Mar 21 04:44:58 crc kubenswrapper[4839]: I0321 04:44:58.210367 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" event={"ID":"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15","Type":"ContainerDied","Data":"715b056e0e8951dcb0bce46eff3f4cc77b23970621f830f7f64bde0192431e68"} Mar 21 04:44:58 crc kubenswrapper[4839]: I0321 04:44:58.859028 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:58 crc kubenswrapper[4839]: I0321 04:44:58.987814 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.209293 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" 
containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.157:5353: connect: connection refused" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.259813 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.259935 4839 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.262705 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.292826 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.329327 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-9c97f4dbd-k2scs" podUID="579308eb-854d-4160-ad35-8677f2d0e634" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.697863 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.698215 4839 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 04:44:59 crc kubenswrapper[4839]: I0321 04:44:59.754743 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.151602 4839 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j"] Mar 21 04:45:00 crc kubenswrapper[4839]: E0321 04:45:00.152050 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api-log" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.152071 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api-log" Mar 21 04:45:00 crc kubenswrapper[4839]: E0321 04:45:00.152111 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.152119 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.152317 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.152358 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c48ff3-782c-4f5a-8a20-0736565e247a" containerName="barbican-api-log" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.153079 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.155847 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.155848 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.172091 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j"] Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.213246 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-secret-volume\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.213377 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rsc2\" (UniqueName: \"kubernetes.io/projected/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-kube-api-access-6rsc2\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.213411 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-config-volume\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.315232 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rsc2\" (UniqueName: \"kubernetes.io/projected/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-kube-api-access-6rsc2\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.315304 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-config-volume\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.315592 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-secret-volume\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.317014 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-config-volume\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.364048 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-secret-volume\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.370272 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rsc2\" (UniqueName: \"kubernetes.io/projected/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-kube-api-access-6rsc2\") pod \"collect-profiles-29567805-9sr8j\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.471583 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.980379 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:45:00 crc kubenswrapper[4839]: I0321 04:45:00.980741 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:45:01 crc kubenswrapper[4839]: I0321 04:45:01.243859 4839 generic.go:334] "Generic (PLEG): container finished" podID="392fb516-8745-40fb-b38d-53106c8310df" containerID="fd761f8a43876dc94432e55d47974d0e86c8a21f74c77ba35b0fda3b1a64872e" exitCode=0 Mar 21 04:45:01 crc kubenswrapper[4839]: I0321 04:45:01.243927 4839 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/barbican-worker-77466dd775-brs5x" event={"ID":"392fb516-8745-40fb-b38d-53106c8310df","Type":"ContainerDied","Data":"fd761f8a43876dc94432e55d47974d0e86c8a21f74c77ba35b0fda3b1a64872e"} Mar 21 04:45:02 crc kubenswrapper[4839]: I0321 04:45:02.073812 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:45:02 crc kubenswrapper[4839]: I0321 04:45:02.483101 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d9cf4c794-jb7lf" Mar 21 04:45:02 crc kubenswrapper[4839]: I0321 04:45:02.585758 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-67dd687666-pgfc5"] Mar 21 04:45:02 crc kubenswrapper[4839]: I0321 04:45:02.585993 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-67dd687666-pgfc5" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api-log" containerID="cri-o://5b137b53eba217c749f810e3fe6d4536182b4cec7923324d43b649cbc888ca03" gracePeriod=30 Mar 21 04:45:02 crc kubenswrapper[4839]: I0321 04:45:02.586536 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-67dd687666-pgfc5" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api" containerID="cri-o://77ad1711dc27b34bdfe011c55c79bdc7d6ef8a5e0c42e7951254c65ef19efa51" gracePeriod=30 Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.264617 4839 generic.go:334] "Generic (PLEG): container finished" podID="6000d2d4-e84a-443f-9094-ab999541331d" containerID="89d53502805454e28eeac8a6f5794fb2c1a2eba3acba95c45fd4f0d839ae56ac" exitCode=0 Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.264915 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qfjms" 
event={"ID":"6000d2d4-e84a-443f-9094-ab999541331d","Type":"ContainerDied","Data":"89d53502805454e28eeac8a6f5794fb2c1a2eba3acba95c45fd4f0d839ae56ac"} Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.273852 4839 generic.go:334] "Generic (PLEG): container finished" podID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerID="5b137b53eba217c749f810e3fe6d4536182b4cec7923324d43b649cbc888ca03" exitCode=143 Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.273919 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67dd687666-pgfc5" event={"ID":"10c8735c-f1c9-40f7-bd34-60bb0749bc23","Type":"ContainerDied","Data":"5b137b53eba217c749f810e3fe6d4536182b4cec7923324d43b649cbc888ca03"} Mar 21 04:45:03 crc kubenswrapper[4839]: E0321 04:45:03.592277 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Mar 21 04:45:03 crc kubenswrapper[4839]: E0321 04:45:03.592719 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5zddh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(6c266726-5bfd-4519-bdd5-9db7f6a77df4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 21 04:45:03 crc kubenswrapper[4839]: E0321 04:45:03.594145 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.729692 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.745070 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.757202 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.806034 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-svc\") pod \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.806105 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-combined-ca-bundle\") pod \"392fb516-8745-40fb-b38d-53106c8310df\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.807439 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data-custom\") pod \"392fb516-8745-40fb-b38d-53106c8310df\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.807511 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n782h\" (UniqueName: \"kubernetes.io/projected/e5f64e49-61a6-4601-b37b-f9af6079108c-kube-api-access-n782h\") pod \"e5f64e49-61a6-4601-b37b-f9af6079108c\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.807533 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data\") pod \"e5f64e49-61a6-4601-b37b-f9af6079108c\" (UID: 
\"e5f64e49-61a6-4601-b37b-f9af6079108c\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.807562 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data-custom\") pod \"e5f64e49-61a6-4601-b37b-f9af6079108c\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.808544 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-sb\") pod \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.808574 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/392fb516-8745-40fb-b38d-53106c8310df-logs\") pod \"392fb516-8745-40fb-b38d-53106c8310df\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809247 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data\") pod \"392fb516-8745-40fb-b38d-53106c8310df\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809275 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-swift-storage-0\") pod \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809310 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-nb\") pod \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809406 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-config\") pod \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809464 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle\") pod \"e5f64e49-61a6-4601-b37b-f9af6079108c\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809493 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f64e49-61a6-4601-b37b-f9af6079108c-logs\") pod \"e5f64e49-61a6-4601-b37b-f9af6079108c\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809529 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf4cm\" (UniqueName: \"kubernetes.io/projected/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-kube-api-access-sf4cm\") pod \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\" (UID: \"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.809554 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmt9m\" (UniqueName: \"kubernetes.io/projected/392fb516-8745-40fb-b38d-53106c8310df-kube-api-access-xmt9m\") pod \"392fb516-8745-40fb-b38d-53106c8310df\" (UID: \"392fb516-8745-40fb-b38d-53106c8310df\") " Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 
04:45:03.819125 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/392fb516-8745-40fb-b38d-53106c8310df-logs" (OuterVolumeSpecName: "logs") pod "392fb516-8745-40fb-b38d-53106c8310df" (UID: "392fb516-8745-40fb-b38d-53106c8310df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.821962 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "392fb516-8745-40fb-b38d-53106c8310df" (UID: "392fb516-8745-40fb-b38d-53106c8310df"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.822142 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e5f64e49-61a6-4601-b37b-f9af6079108c" (UID: "e5f64e49-61a6-4601-b37b-f9af6079108c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.822528 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5f64e49-61a6-4601-b37b-f9af6079108c-logs" (OuterVolumeSpecName: "logs") pod "e5f64e49-61a6-4601-b37b-f9af6079108c" (UID: "e5f64e49-61a6-4601-b37b-f9af6079108c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.822970 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392fb516-8745-40fb-b38d-53106c8310df-kube-api-access-xmt9m" (OuterVolumeSpecName: "kube-api-access-xmt9m") pod "392fb516-8745-40fb-b38d-53106c8310df" (UID: "392fb516-8745-40fb-b38d-53106c8310df"). InnerVolumeSpecName "kube-api-access-xmt9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.843318 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f64e49-61a6-4601-b37b-f9af6079108c-kube-api-access-n782h" (OuterVolumeSpecName: "kube-api-access-n782h") pod "e5f64e49-61a6-4601-b37b-f9af6079108c" (UID: "e5f64e49-61a6-4601-b37b-f9af6079108c"). InnerVolumeSpecName "kube-api-access-n782h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.843872 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-kube-api-access-sf4cm" (OuterVolumeSpecName: "kube-api-access-sf4cm") pod "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" (UID: "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15"). InnerVolumeSpecName "kube-api-access-sf4cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.895157 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "392fb516-8745-40fb-b38d-53106c8310df" (UID: "392fb516-8745-40fb-b38d-53106c8310df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.914936 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmt9m\" (UniqueName: \"kubernetes.io/projected/392fb516-8745-40fb-b38d-53106c8310df-kube-api-access-xmt9m\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.914974 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.914987 4839 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.914997 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n782h\" (UniqueName: \"kubernetes.io/projected/e5f64e49-61a6-4601-b37b-f9af6079108c-kube-api-access-n782h\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.915010 4839 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.915023 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/392fb516-8745-40fb-b38d-53106c8310df-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.915035 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f64e49-61a6-4601-b37b-f9af6079108c-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.915045 4839 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf4cm\" (UniqueName: \"kubernetes.io/projected/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-kube-api-access-sf4cm\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:03 crc kubenswrapper[4839]: E0321 04:45:03.959314 4839 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle podName:e5f64e49-61a6-4601-b37b-f9af6079108c nodeName:}" failed. No retries permitted until 2026-03-21 04:45:04.45928051 +0000 UTC m=+1308.787067186 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle") pod "e5f64e49-61a6-4601-b37b-f9af6079108c" (UID: "e5f64e49-61a6-4601-b37b-f9af6079108c") : error deleting /var/lib/kubelet/pods/e5f64e49-61a6-4601-b37b-f9af6079108c/volume-subpaths: remove /var/lib/kubelet/pods/e5f64e49-61a6-4601-b37b-f9af6079108c/volume-subpaths: no such file or directory Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.959924 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" (UID: "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.959958 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" (UID: "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.960058 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-config" (OuterVolumeSpecName: "config") pod "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" (UID: "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.962793 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data" (OuterVolumeSpecName: "config-data") pod "e5f64e49-61a6-4601-b37b-f9af6079108c" (UID: "e5f64e49-61a6-4601-b37b-f9af6079108c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.964428 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" (UID: "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.969164 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" (UID: "aeba2ea1-e2d1-46d2-8e89-982cd58f3b15"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:03 crc kubenswrapper[4839]: I0321 04:45:03.989823 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data" (OuterVolumeSpecName: "config-data") pod "392fb516-8745-40fb-b38d-53106c8310df" (UID: "392fb516-8745-40fb-b38d-53106c8310df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.017247 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.017277 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.017286 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.017296 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.017304 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.017313 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/392fb516-8745-40fb-b38d-53106c8310df-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.017323 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.188255 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j"] Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.230597 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.293343 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.293334 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-w74nb" event={"ID":"aeba2ea1-e2d1-46d2-8e89-982cd58f3b15","Type":"ContainerDied","Data":"594e402f9f6a6f0c691bdaa0d2b0e852622545812087194fbffc061c3f4fc05b"} Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.293794 4839 scope.go:117] "RemoveContainer" containerID="715b056e0e8951dcb0bce46eff3f4cc77b23970621f830f7f64bde0192431e68" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.302194 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" event={"ID":"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26","Type":"ContainerStarted","Data":"06a1cfd95284ed8daa71d7f99c5f1c2898ca406b61f4a56279166d4934b555c0"} Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.307727 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-77466dd775-brs5x" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.308071 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-77466dd775-brs5x" event={"ID":"392fb516-8745-40fb-b38d-53106c8310df","Type":"ContainerDied","Data":"aaed11a9eedbf42719456ee569b2249eb2ed204275c2e327bc69e0e9ba6d0c2e"} Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.314913 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" event={"ID":"e5f64e49-61a6-4601-b37b-f9af6079108c","Type":"ContainerDied","Data":"5a91be0fc03b74a93a2e4a4a9276bbbc5ad15e2d8798017231a484ecfe7f2ff6"} Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.314972 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-56b894998b-l59vx" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.315726 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="ceilometer-notification-agent" containerID="cri-o://0a64d9a20f4f5b5d0b9782608a440c655769c9db2754bb98b7278494dc83ae14" gracePeriod=30 Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.315883 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="sg-core" containerID="cri-o://97884e844baf73e80ed5f7a5c51d988d7d7009365523dceffa2a7bc9d1e19948" gracePeriod=30 Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.340168 4839 scope.go:117] "RemoveContainer" containerID="4d47c811396da6675e98576b8f9d542f9a6e50f5a5df44132f5048a5caae6747" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.346967 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-w74nb"] Mar 21 04:45:04 crc 
kubenswrapper[4839]: I0321 04:45:04.355132 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-w74nb"] Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.393871 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-77466dd775-brs5x"] Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.402042 4839 scope.go:117] "RemoveContainer" containerID="fd761f8a43876dc94432e55d47974d0e86c8a21f74c77ba35b0fda3b1a64872e" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.402867 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-77466dd775-brs5x"] Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.441351 4839 scope.go:117] "RemoveContainer" containerID="ae270d4db24cad72c19c535910d2db30e454a5a89554335e5a48e6588c21e3f2" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.478661 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392fb516-8745-40fb-b38d-53106c8310df" path="/var/lib/kubelet/pods/392fb516-8745-40fb-b38d-53106c8310df/volumes" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.479231 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" path="/var/lib/kubelet/pods/aeba2ea1-e2d1-46d2-8e89-982cd58f3b15/volumes" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.486570 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-98964f649-mrjrt"] Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.486803 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-98964f649-mrjrt" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-api" containerID="cri-o://6e416952cf65a99f24d43cb637a81bb2e071806b75507c88029a3d669986edf2" gracePeriod=30 Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.486944 4839 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-98964f649-mrjrt" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-httpd" containerID="cri-o://c52f7b158358ef8b38cfac03210bf15a4ca76a8dbb9c567dbd73763e507062d1" gracePeriod=30 Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.489172 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-748dbf85fc-jslwv"] Mar 21 04:45:04 crc kubenswrapper[4839]: E0321 04:45:04.489505 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392fb516-8745-40fb-b38d-53106c8310df" containerName="barbican-worker" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.489518 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="392fb516-8745-40fb-b38d-53106c8310df" containerName="barbican-worker" Mar 21 04:45:04 crc kubenswrapper[4839]: E0321 04:45:04.489531 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.489536 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener" Mar 21 04:45:04 crc kubenswrapper[4839]: E0321 04:45:04.489562 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener-log" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.489570 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener-log" Mar 21 04:45:04 crc kubenswrapper[4839]: E0321 04:45:04.489581 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392fb516-8745-40fb-b38d-53106c8310df" containerName="barbican-worker-log" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.489587 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="392fb516-8745-40fb-b38d-53106c8310df" 
containerName="barbican-worker-log" Mar 21 04:45:04 crc kubenswrapper[4839]: E0321 04:45:04.489614 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerName="dnsmasq-dns" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.489621 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerName="dnsmasq-dns" Mar 21 04:45:04 crc kubenswrapper[4839]: E0321 04:45:04.489633 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerName="init" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.489638 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerName="init" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.490088 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.490110 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="392fb516-8745-40fb-b38d-53106c8310df" containerName="barbican-worker-log" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.490121 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" containerName="barbican-keystone-listener-log" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.490132 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeba2ea1-e2d1-46d2-8e89-982cd58f3b15" containerName="dnsmasq-dns" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.490140 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="392fb516-8745-40fb-b38d-53106c8310df" containerName="barbican-worker" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.491659 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.494931 4839 scope.go:117] "RemoveContainer" containerID="d2facd4bdb131b2c8282fb6aaa7c8482ec973fccdc0e979a7ef3bb2d591fc401" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.505855 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-748dbf85fc-jslwv"] Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.532424 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle\") pod \"e5f64e49-61a6-4601-b37b-f9af6079108c\" (UID: \"e5f64e49-61a6-4601-b37b-f9af6079108c\") " Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.532758 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-internal-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.532809 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-combined-ca-bundle\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.532833 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-config\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: 
I0321 04:45:04.532918 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-ovndb-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.532957 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-httpd-config\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.532978 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-public-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.533038 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzlqg\" (UniqueName: \"kubernetes.io/projected/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-kube-api-access-xzlqg\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.537461 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5f64e49-61a6-4601-b37b-f9af6079108c" (UID: "e5f64e49-61a6-4601-b37b-f9af6079108c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.538395 4839 scope.go:117] "RemoveContainer" containerID="a4599ccee0ab959c4e13333beacaf8a8188b2b3effd5f393fa70626300ad559e" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.602513 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-98964f649-mrjrt" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": read tcp 10.217.0.2:57558->10.217.0.159:9696: read: connection reset by peer" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.634991 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-ovndb-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.635043 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-httpd-config\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.635062 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-public-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.635119 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzlqg\" (UniqueName: 
\"kubernetes.io/projected/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-kube-api-access-xzlqg\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.635145 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-internal-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.635177 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-combined-ca-bundle\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.635195 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-config\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.635253 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f64e49-61a6-4601-b37b-f9af6079108c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.641184 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-config\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc 
kubenswrapper[4839]: I0321 04:45:04.650200 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-public-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.654513 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-httpd-config\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.656481 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzlqg\" (UniqueName: \"kubernetes.io/projected/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-kube-api-access-xzlqg\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.671950 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-ovndb-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.675856 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-combined-ca-bundle\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.682078 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-keystone-listener-56b894998b-l59vx"] Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.689432 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd21ac8b-d3c0-4f0c-9205-d60d55425d8a-internal-tls-certs\") pod \"neutron-748dbf85fc-jslwv\" (UID: \"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a\") " pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.724435 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-56b894998b-l59vx"] Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.820957 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.869953 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qfjms" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.939513 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-db-sync-config-data\") pod \"6000d2d4-e84a-443f-9094-ab999541331d\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.939617 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-scripts\") pod \"6000d2d4-e84a-443f-9094-ab999541331d\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.939648 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6mpt\" (UniqueName: \"kubernetes.io/projected/6000d2d4-e84a-443f-9094-ab999541331d-kube-api-access-g6mpt\") pod \"6000d2d4-e84a-443f-9094-ab999541331d\" 
(UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.939780 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-combined-ca-bundle\") pod \"6000d2d4-e84a-443f-9094-ab999541331d\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.939824 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-config-data\") pod \"6000d2d4-e84a-443f-9094-ab999541331d\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.939870 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6000d2d4-e84a-443f-9094-ab999541331d-etc-machine-id\") pod \"6000d2d4-e84a-443f-9094-ab999541331d\" (UID: \"6000d2d4-e84a-443f-9094-ab999541331d\") " Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.940254 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6000d2d4-e84a-443f-9094-ab999541331d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6000d2d4-e84a-443f-9094-ab999541331d" (UID: "6000d2d4-e84a-443f-9094-ab999541331d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.944965 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-scripts" (OuterVolumeSpecName: "scripts") pod "6000d2d4-e84a-443f-9094-ab999541331d" (UID: "6000d2d4-e84a-443f-9094-ab999541331d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.951611 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6000d2d4-e84a-443f-9094-ab999541331d-kube-api-access-g6mpt" (OuterVolumeSpecName: "kube-api-access-g6mpt") pod "6000d2d4-e84a-443f-9094-ab999541331d" (UID: "6000d2d4-e84a-443f-9094-ab999541331d"). InnerVolumeSpecName "kube-api-access-g6mpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.951619 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6000d2d4-e84a-443f-9094-ab999541331d" (UID: "6000d2d4-e84a-443f-9094-ab999541331d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.969208 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6000d2d4-e84a-443f-9094-ab999541331d" (UID: "6000d2d4-e84a-443f-9094-ab999541331d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:04 crc kubenswrapper[4839]: I0321 04:45:04.996689 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-config-data" (OuterVolumeSpecName: "config-data") pod "6000d2d4-e84a-443f-9094-ab999541331d" (UID: "6000d2d4-e84a-443f-9094-ab999541331d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.041828 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.041863 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6mpt\" (UniqueName: \"kubernetes.io/projected/6000d2d4-e84a-443f-9094-ab999541331d-kube-api-access-g6mpt\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.041902 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.041919 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.041930 4839 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6000d2d4-e84a-443f-9094-ab999541331d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.041939 4839 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6000d2d4-e84a-443f-9094-ab999541331d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.325707 4839 generic.go:334] "Generic (PLEG): container finished" podID="e965d008-890b-408c-a5a8-823aca00140a" containerID="c52f7b158358ef8b38cfac03210bf15a4ca76a8dbb9c567dbd73763e507062d1" exitCode=0 Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.325794 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98964f649-mrjrt" event={"ID":"e965d008-890b-408c-a5a8-823aca00140a","Type":"ContainerDied","Data":"c52f7b158358ef8b38cfac03210bf15a4ca76a8dbb9c567dbd73763e507062d1"} Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.329083 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qfjms" event={"ID":"6000d2d4-e84a-443f-9094-ab999541331d","Type":"ContainerDied","Data":"4d5cb1d53067040b399cf367f961d70a4e98d3e793e42e6da997085ddb0d9688"} Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.329113 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d5cb1d53067040b399cf367f961d70a4e98d3e793e42e6da997085ddb0d9688" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.329162 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qfjms" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.332178 4839 generic.go:334] "Generic (PLEG): container finished" podID="47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" containerID="77bf1caf6b0a8e86542e0854eb602cb5e02b5990b61a512100dca57b8da7f1d1" exitCode=0 Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.332249 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" event={"ID":"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26","Type":"ContainerDied","Data":"77bf1caf6b0a8e86542e0854eb602cb5e02b5990b61a512100dca57b8da7f1d1"} Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.334407 4839 generic.go:334] "Generic (PLEG): container finished" podID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerID="97884e844baf73e80ed5f7a5c51d988d7d7009365523dceffa2a7bc9d1e19948" exitCode=2 Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.334478 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6c266726-5bfd-4519-bdd5-9db7f6a77df4","Type":"ContainerDied","Data":"97884e844baf73e80ed5f7a5c51d988d7d7009365523dceffa2a7bc9d1e19948"} Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.370953 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-748dbf85fc-jslwv"] Mar 21 04:45:05 crc kubenswrapper[4839]: W0321 04:45:05.375868 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd21ac8b_d3c0_4f0c_9205_d60d55425d8a.slice/crio-ac9ca763b0341f17070dbd9db2d04ffcd513f8092500c1d0f5359ce9caa74b46 WatchSource:0}: Error finding container ac9ca763b0341f17070dbd9db2d04ffcd513f8092500c1d0f5359ce9caa74b46: Status 404 returned error can't find the container with id ac9ca763b0341f17070dbd9db2d04ffcd513f8092500c1d0f5359ce9caa74b46 Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.636538 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-qc28r"] Mar 21 04:45:05 crc kubenswrapper[4839]: E0321 04:45:05.636982 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6000d2d4-e84a-443f-9094-ab999541331d" containerName="cinder-db-sync" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.636996 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6000d2d4-e84a-443f-9094-ab999541331d" containerName="cinder-db-sync" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.637174 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6000d2d4-e84a-443f-9094-ab999541331d" containerName="cinder-db-sync" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.638169 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.684077 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-qc28r"] Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.731000 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.732871 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.740083 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.740472 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4bb6p" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.740671 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.740863 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.758636 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.758688 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wj7c\" (UniqueName: \"kubernetes.io/projected/439bd408-2f5c-45cc-a2f7-8166a4a279c2-kube-api-access-4wj7c\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: 
\"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.758756 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-config\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.758796 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-svc\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.758846 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.758893 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.768655 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860461 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82b135c8-5fc8-4930-9577-1dd9181a1dae-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860540 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860566 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wj7c\" (UniqueName: \"kubernetes.io/projected/439bd408-2f5c-45cc-a2f7-8166a4a279c2-kube-api-access-4wj7c\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860636 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-scripts\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860657 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860710 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nhfj\" 
(UniqueName: \"kubernetes.io/projected/82b135c8-5fc8-4930-9577-1dd9181a1dae-kube-api-access-9nhfj\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860737 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-config\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860767 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860802 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-svc\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860855 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860893 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.860926 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.861795 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.861906 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-svc\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.862144 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-config\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.863661 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-qc28r\" 
(UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.864517 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.887170 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wj7c\" (UniqueName: \"kubernetes.io/projected/439bd408-2f5c-45cc-a2f7-8166a4a279c2-kube-api-access-4wj7c\") pod \"dnsmasq-dns-6578955fd5-qc28r\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.949456 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.951273 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.954034 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.958716 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.964691 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82b135c8-5fc8-4930-9577-1dd9181a1dae-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.964760 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-scripts\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.964786 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.964817 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82b135c8-5fc8-4930-9577-1dd9181a1dae-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.964837 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nhfj\" 
(UniqueName: \"kubernetes.io/projected/82b135c8-5fc8-4930-9577-1dd9181a1dae-kube-api-access-9nhfj\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.964880 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.964958 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.968559 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-scripts\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.970513 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.988210 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " 
pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.989131 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:05 crc kubenswrapper[4839]: I0321 04:45:05.989805 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.003382 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nhfj\" (UniqueName: \"kubernetes.io/projected/82b135c8-5fc8-4930-9577-1dd9181a1dae-kube-api-access-9nhfj\") pod \"cinder-scheduler-0\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.067310 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63efa50f-a0e7-4912-bbd8-c610daf572fd-logs\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.067381 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63efa50f-a0e7-4912-bbd8-c610daf572fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.067439 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data\") pod \"cinder-api-0\" (UID: 
\"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.067478 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.067517 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-scripts\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.067540 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmpmm\" (UniqueName: \"kubernetes.io/projected/63efa50f-a0e7-4912-bbd8-c610daf572fd-kube-api-access-kmpmm\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.067695 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.073838 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.169039 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63efa50f-a0e7-4912-bbd8-c610daf572fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.169425 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.169479 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.169520 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-scripts\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.169540 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmpmm\" (UniqueName: \"kubernetes.io/projected/63efa50f-a0e7-4912-bbd8-c610daf572fd-kube-api-access-kmpmm\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.169690 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.169771 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63efa50f-a0e7-4912-bbd8-c610daf572fd-logs\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.170243 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63efa50f-a0e7-4912-bbd8-c610daf572fd-logs\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.170306 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63efa50f-a0e7-4912-bbd8-c610daf572fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.176440 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.176843 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-scripts\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.177438 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.177655 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.200462 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmpmm\" (UniqueName: \"kubernetes.io/projected/63efa50f-a0e7-4912-bbd8-c610daf572fd-kube-api-access-kmpmm\") pod \"cinder-api-0\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.313916 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.358311 4839 generic.go:334] "Generic (PLEG): container finished" podID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerID="77ad1711dc27b34bdfe011c55c79bdc7d6ef8a5e0c42e7951254c65ef19efa51" exitCode=0 Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.358374 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67dd687666-pgfc5" event={"ID":"10c8735c-f1c9-40f7-bd34-60bb0749bc23","Type":"ContainerDied","Data":"77ad1711dc27b34bdfe011c55c79bdc7d6ef8a5e0c42e7951254c65ef19efa51"} Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.363489 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748dbf85fc-jslwv" event={"ID":"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a","Type":"ContainerStarted","Data":"3b862c544d0747d8c936d406da9f0474ff7e8a90021f0f455055aabe5b622b8f"} Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.364259 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748dbf85fc-jslwv" event={"ID":"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a","Type":"ContainerStarted","Data":"a7ca8e601071db58f4f665717965ec3f7697dafded6b1d08bad598e9d13b62ad"} Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.364342 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-748dbf85fc-jslwv" event={"ID":"cd21ac8b-d3c0-4f0c-9205-d60d55425d8a","Type":"ContainerStarted","Data":"ac9ca763b0341f17070dbd9db2d04ffcd513f8092500c1d0f5359ce9caa74b46"} Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.364499 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.411679 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-748dbf85fc-jslwv" podStartSLOduration=2.411657206 podStartE2EDuration="2.411657206s" 
podCreationTimestamp="2026-03-21 04:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:06.384198788 +0000 UTC m=+1310.711985464" watchObservedRunningTime="2026-03-21 04:45:06.411657206 +0000 UTC m=+1310.739443882" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.476331 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f64e49-61a6-4601-b37b-f9af6079108c" path="/var/lib/kubelet/pods/e5f64e49-61a6-4601-b37b-f9af6079108c/volumes" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.531547 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-qc28r"] Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.684320 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.810228 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.897920 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.905743 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-98964f649-mrjrt" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": dial tcp 10.217.0.159:9696: connect: connection refused" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.906298 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj9bx\" (UniqueName: \"kubernetes.io/projected/10c8735c-f1c9-40f7-bd34-60bb0749bc23-kube-api-access-pj9bx\") pod \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.906416 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data-custom\") pod \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.906465 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c8735c-f1c9-40f7-bd34-60bb0749bc23-logs\") pod \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.906522 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data\") pod \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.906569 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-combined-ca-bundle\") pod \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\" (UID: \"10c8735c-f1c9-40f7-bd34-60bb0749bc23\") " Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.907135 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10c8735c-f1c9-40f7-bd34-60bb0749bc23-logs" (OuterVolumeSpecName: "logs") pod "10c8735c-f1c9-40f7-bd34-60bb0749bc23" (UID: "10c8735c-f1c9-40f7-bd34-60bb0749bc23"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.907539 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c8735c-f1c9-40f7-bd34-60bb0749bc23-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.913617 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "10c8735c-f1c9-40f7-bd34-60bb0749bc23" (UID: "10c8735c-f1c9-40f7-bd34-60bb0749bc23"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.914060 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c8735c-f1c9-40f7-bd34-60bb0749bc23-kube-api-access-pj9bx" (OuterVolumeSpecName: "kube-api-access-pj9bx") pod "10c8735c-f1c9-40f7-bd34-60bb0749bc23" (UID: "10c8735c-f1c9-40f7-bd34-60bb0749bc23"). InnerVolumeSpecName "kube-api-access-pj9bx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.937522 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10c8735c-f1c9-40f7-bd34-60bb0749bc23" (UID: "10c8735c-f1c9-40f7-bd34-60bb0749bc23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:06 crc kubenswrapper[4839]: I0321 04:45:06.980219 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data" (OuterVolumeSpecName: "config-data") pod "10c8735c-f1c9-40f7-bd34-60bb0749bc23" (UID: "10c8735c-f1c9-40f7-bd34-60bb0749bc23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.008744 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-secret-volume\") pod \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.008886 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-config-volume\") pod \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.008906 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rsc2\" (UniqueName: \"kubernetes.io/projected/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-kube-api-access-6rsc2\") pod \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\" (UID: \"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26\") " 
Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.009350 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj9bx\" (UniqueName: \"kubernetes.io/projected/10c8735c-f1c9-40f7-bd34-60bb0749bc23-kube-api-access-pj9bx\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.009364 4839 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.009373 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.009384 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c8735c-f1c9-40f7-bd34-60bb0749bc23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.009561 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-config-volume" (OuterVolumeSpecName: "config-volume") pod "47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" (UID: "47d5a79e-3e14-4d49-bed4-a9c49e7b7f26"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.012694 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" (UID: "47d5a79e-3e14-4d49-bed4-a9c49e7b7f26"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.012928 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-kube-api-access-6rsc2" (OuterVolumeSpecName: "kube-api-access-6rsc2") pod "47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" (UID: "47d5a79e-3e14-4d49-bed4-a9c49e7b7f26"). InnerVolumeSpecName "kube-api-access-6rsc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.095013 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:07 crc kubenswrapper[4839]: W0321 04:45:07.097013 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63efa50f_a0e7_4912_bbd8_c610daf572fd.slice/crio-6c8ae425c983b2b78de1cbadfa55da8fe679ecb65a24d6c38077401c6093e407 WatchSource:0}: Error finding container 6c8ae425c983b2b78de1cbadfa55da8fe679ecb65a24d6c38077401c6093e407: Status 404 returned error can't find the container with id 6c8ae425c983b2b78de1cbadfa55da8fe679ecb65a24d6c38077401c6093e407 Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.111178 4839 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.111210 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rsc2\" (UniqueName: \"kubernetes.io/projected/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-kube-api-access-6rsc2\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.111219 4839 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26-secret-volume\") on node \"crc\" 
DevicePath \"\"" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.404249 4839 generic.go:334] "Generic (PLEG): container finished" podID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerID="583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b" exitCode=0 Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.404529 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" event={"ID":"439bd408-2f5c-45cc-a2f7-8166a4a279c2","Type":"ContainerDied","Data":"583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b"} Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.404564 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" event={"ID":"439bd408-2f5c-45cc-a2f7-8166a4a279c2","Type":"ContainerStarted","Data":"21228341591d8e5aec6ec7937412b30e803ee3d8f05a2c9720bd304ab86d36ca"} Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.409998 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82b135c8-5fc8-4930-9577-1dd9181a1dae","Type":"ContainerStarted","Data":"63601f472b5773f8a2494b64c1e43c04ccfd11d451bf08893682392398171dec"} Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.418058 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" event={"ID":"47d5a79e-3e14-4d49-bed4-a9c49e7b7f26","Type":"ContainerDied","Data":"06a1cfd95284ed8daa71d7f99c5f1c2898ca406b61f4a56279166d4934b555c0"} Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.418103 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06a1cfd95284ed8daa71d7f99c5f1c2898ca406b61f4a56279166d4934b555c0" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.418170 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.439892 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67dd687666-pgfc5" event={"ID":"10c8735c-f1c9-40f7-bd34-60bb0749bc23","Type":"ContainerDied","Data":"eef1fbdee77ab0e9e93f444be08641661339d7257383b84dc3aa743e29072b31"} Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.439939 4839 scope.go:117] "RemoveContainer" containerID="77ad1711dc27b34bdfe011c55c79bdc7d6ef8a5e0c42e7951254c65ef19efa51" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.440042 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-67dd687666-pgfc5" Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.447795 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"63efa50f-a0e7-4912-bbd8-c610daf572fd","Type":"ContainerStarted","Data":"6c8ae425c983b2b78de1cbadfa55da8fe679ecb65a24d6c38077401c6093e407"} Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.508819 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-67dd687666-pgfc5"] Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.533223 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-67dd687666-pgfc5"] Mar 21 04:45:07 crc kubenswrapper[4839]: I0321 04:45:07.559551 4839 scope.go:117] "RemoveContainer" containerID="5b137b53eba217c749f810e3fe6d4536182b4cec7923324d43b649cbc888ca03" Mar 21 04:45:08 crc kubenswrapper[4839]: I0321 04:45:08.441435 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:08 crc kubenswrapper[4839]: I0321 04:45:08.472396 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" path="/var/lib/kubelet/pods/10c8735c-f1c9-40f7-bd34-60bb0749bc23/volumes" Mar 21 
04:45:08 crc kubenswrapper[4839]: I0321 04:45:08.473508 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82b135c8-5fc8-4930-9577-1dd9181a1dae","Type":"ContainerStarted","Data":"94e949d149a7c44ebe62d3aa61b17816324f2bfff6307c3c5a788a6a257442fa"} Mar 21 04:45:08 crc kubenswrapper[4839]: I0321 04:45:08.476470 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"63efa50f-a0e7-4912-bbd8-c610daf572fd","Type":"ContainerStarted","Data":"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5"} Mar 21 04:45:08 crc kubenswrapper[4839]: I0321 04:45:08.478098 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" event={"ID":"439bd408-2f5c-45cc-a2f7-8166a4a279c2","Type":"ContainerStarted","Data":"c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18"} Mar 21 04:45:08 crc kubenswrapper[4839]: I0321 04:45:08.478218 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:45:08 crc kubenswrapper[4839]: I0321 04:45:08.501240 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" podStartSLOduration=3.501214752 podStartE2EDuration="3.501214752s" podCreationTimestamp="2026-03-21 04:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:08.49682835 +0000 UTC m=+1312.824615046" watchObservedRunningTime="2026-03-21 04:45:08.501214752 +0000 UTC m=+1312.829001428" Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.495797 4839 generic.go:334] "Generic (PLEG): container finished" podID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerID="0a64d9a20f4f5b5d0b9782608a440c655769c9db2754bb98b7278494dc83ae14" exitCode=0 Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.495890 4839 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c266726-5bfd-4519-bdd5-9db7f6a77df4","Type":"ContainerDied","Data":"0a64d9a20f4f5b5d0b9782608a440c655769c9db2754bb98b7278494dc83ae14"} Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.505064 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"63efa50f-a0e7-4912-bbd8-c610daf572fd","Type":"ContainerStarted","Data":"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c"} Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.505188 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api-log" containerID="cri-o://bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5" gracePeriod=30 Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.505281 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api" containerID="cri-o://cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c" gracePeriod=30 Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.505431 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.519285 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82b135c8-5fc8-4930-9577-1dd9181a1dae","Type":"ContainerStarted","Data":"29fe57ad79066c4ceb1edc999f7c370920b15cd9f7e5562d2b7d130cb45455af"} Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.525792 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.525778032 podStartE2EDuration="4.525778032s" podCreationTimestamp="2026-03-21 04:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:09.524075135 +0000 UTC m=+1313.851861831" watchObservedRunningTime="2026-03-21 04:45:09.525778032 +0000 UTC m=+1313.853564708" Mar 21 04:45:09 crc kubenswrapper[4839]: I0321 04:45:09.559260 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.962041362 podStartE2EDuration="4.559210228s" podCreationTimestamp="2026-03-21 04:45:05 +0000 UTC" firstStartedPulling="2026-03-21 04:45:06.75640487 +0000 UTC m=+1311.084191546" lastFinishedPulling="2026-03-21 04:45:07.353573736 +0000 UTC m=+1311.681360412" observedRunningTime="2026-03-21 04:45:09.557611753 +0000 UTC m=+1313.885398429" watchObservedRunningTime="2026-03-21 04:45:09.559210228 +0000 UTC m=+1313.886996904" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.040771 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.101840 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-run-httpd\") pod \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.101913 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-log-httpd\") pod \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.101952 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-sg-core-conf-yaml\") pod 
\"6c266726-5bfd-4519-bdd5-9db7f6a77df4\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102053 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-scripts\") pod \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102071 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-config-data\") pod \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102119 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-combined-ca-bundle\") pod \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102193 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zddh\" (UniqueName: \"kubernetes.io/projected/6c266726-5bfd-4519-bdd5-9db7f6a77df4-kube-api-access-5zddh\") pod \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\" (UID: \"6c266726-5bfd-4519-bdd5-9db7f6a77df4\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102238 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6c266726-5bfd-4519-bdd5-9db7f6a77df4" (UID: "6c266726-5bfd-4519-bdd5-9db7f6a77df4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102249 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6c266726-5bfd-4519-bdd5-9db7f6a77df4" (UID: "6c266726-5bfd-4519-bdd5-9db7f6a77df4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102559 4839 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.102590 4839 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c266726-5bfd-4519-bdd5-9db7f6a77df4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.112707 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-scripts" (OuterVolumeSpecName: "scripts") pod "6c266726-5bfd-4519-bdd5-9db7f6a77df4" (UID: "6c266726-5bfd-4519-bdd5-9db7f6a77df4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.156613 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c266726-5bfd-4519-bdd5-9db7f6a77df4" (UID: "6c266726-5bfd-4519-bdd5-9db7f6a77df4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.156662 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c266726-5bfd-4519-bdd5-9db7f6a77df4-kube-api-access-5zddh" (OuterVolumeSpecName: "kube-api-access-5zddh") pod "6c266726-5bfd-4519-bdd5-9db7f6a77df4" (UID: "6c266726-5bfd-4519-bdd5-9db7f6a77df4"). InnerVolumeSpecName "kube-api-access-5zddh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.165121 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6c266726-5bfd-4519-bdd5-9db7f6a77df4" (UID: "6c266726-5bfd-4519-bdd5-9db7f6a77df4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.168446 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-config-data" (OuterVolumeSpecName: "config-data") pod "6c266726-5bfd-4519-bdd5-9db7f6a77df4" (UID: "6c266726-5bfd-4519-bdd5-9db7f6a77df4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.205994 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.206031 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zddh\" (UniqueName: \"kubernetes.io/projected/6c266726-5bfd-4519-bdd5-9db7f6a77df4-kube-api-access-5zddh\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.206043 4839 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.206051 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.206063 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c266726-5bfd-4519-bdd5-9db7f6a77df4-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.279547 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409373 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-combined-ca-bundle\") pod \"63efa50f-a0e7-4912-bbd8-c610daf572fd\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409489 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-scripts\") pod \"63efa50f-a0e7-4912-bbd8-c610daf572fd\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409631 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63efa50f-a0e7-4912-bbd8-c610daf572fd-logs\") pod \"63efa50f-a0e7-4912-bbd8-c610daf572fd\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409688 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data\") pod \"63efa50f-a0e7-4912-bbd8-c610daf572fd\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409744 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmpmm\" (UniqueName: \"kubernetes.io/projected/63efa50f-a0e7-4912-bbd8-c610daf572fd-kube-api-access-kmpmm\") pod \"63efa50f-a0e7-4912-bbd8-c610daf572fd\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409780 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/63efa50f-a0e7-4912-bbd8-c610daf572fd-etc-machine-id\") pod \"63efa50f-a0e7-4912-bbd8-c610daf572fd\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409831 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data-custom\") pod \"63efa50f-a0e7-4912-bbd8-c610daf572fd\" (UID: \"63efa50f-a0e7-4912-bbd8-c610daf572fd\") " Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.409925 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63efa50f-a0e7-4912-bbd8-c610daf572fd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "63efa50f-a0e7-4912-bbd8-c610daf572fd" (UID: "63efa50f-a0e7-4912-bbd8-c610daf572fd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.410294 4839 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63efa50f-a0e7-4912-bbd8-c610daf572fd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.410370 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63efa50f-a0e7-4912-bbd8-c610daf572fd-logs" (OuterVolumeSpecName: "logs") pod "63efa50f-a0e7-4912-bbd8-c610daf572fd" (UID: "63efa50f-a0e7-4912-bbd8-c610daf572fd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.414765 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63efa50f-a0e7-4912-bbd8-c610daf572fd-kube-api-access-kmpmm" (OuterVolumeSpecName: "kube-api-access-kmpmm") pod "63efa50f-a0e7-4912-bbd8-c610daf572fd" (UID: "63efa50f-a0e7-4912-bbd8-c610daf572fd"). InnerVolumeSpecName "kube-api-access-kmpmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.415252 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-scripts" (OuterVolumeSpecName: "scripts") pod "63efa50f-a0e7-4912-bbd8-c610daf572fd" (UID: "63efa50f-a0e7-4912-bbd8-c610daf572fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.415554 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "63efa50f-a0e7-4912-bbd8-c610daf572fd" (UID: "63efa50f-a0e7-4912-bbd8-c610daf572fd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.432991 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63efa50f-a0e7-4912-bbd8-c610daf572fd" (UID: "63efa50f-a0e7-4912-bbd8-c610daf572fd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.472937 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data" (OuterVolumeSpecName: "config-data") pod "63efa50f-a0e7-4912-bbd8-c610daf572fd" (UID: "63efa50f-a0e7-4912-bbd8-c610daf572fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.512072 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.512122 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.512136 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63efa50f-a0e7-4912-bbd8-c610daf572fd-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.512146 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.512156 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmpmm\" (UniqueName: \"kubernetes.io/projected/63efa50f-a0e7-4912-bbd8-c610daf572fd-kube-api-access-kmpmm\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.512170 4839 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/63efa50f-a0e7-4912-bbd8-c610daf572fd-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.535773 4839 generic.go:334] "Generic (PLEG): container finished" podID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerID="cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c" exitCode=0 Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.535808 4839 generic.go:334] "Generic (PLEG): container finished" podID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerID="bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5" exitCode=143 Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.535849 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"63efa50f-a0e7-4912-bbd8-c610daf572fd","Type":"ContainerDied","Data":"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c"} Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.535879 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"63efa50f-a0e7-4912-bbd8-c610daf572fd","Type":"ContainerDied","Data":"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5"} Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.535892 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"63efa50f-a0e7-4912-bbd8-c610daf572fd","Type":"ContainerDied","Data":"6c8ae425c983b2b78de1cbadfa55da8fe679ecb65a24d6c38077401c6093e407"} Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.535912 4839 scope.go:117] "RemoveContainer" containerID="cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.536071 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.546874 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c266726-5bfd-4519-bdd5-9db7f6a77df4","Type":"ContainerDied","Data":"41ed81fbf037f8ebe50fd1cd4bb84f9e7c73f61ee6cb668dca265d806ca14d96"} Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.546970 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.571519 4839 scope.go:117] "RemoveContainer" containerID="bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.611954 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.615623 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.631802 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.643255 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.656463 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.656942 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="ceilometer-notification-agent" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.656967 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="ceilometer-notification-agent" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.656984 4839 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.656992 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.657014 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api-log" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657022 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api-log" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.657041 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657048 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.657071 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="sg-core" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657080 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="sg-core" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.657097 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api-log" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657104 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api-log" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.657114 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" 
containerName="collect-profiles" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657121 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" containerName="collect-profiles" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657323 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657344 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="ceilometer-notification-agent" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657362 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" containerName="sg-core" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657373 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c8735c-f1c9-40f7-bd34-60bb0749bc23" containerName="barbican-api-log" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657384 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api-log" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657401 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" containerName="collect-profiles" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.657416 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" containerName="cinder-api" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.658234 4839 scope.go:117] "RemoveContainer" containerID="cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.658744 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c\": container with ID starting with cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c not found: ID does not exist" containerID="cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.658777 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c"} err="failed to get container status \"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c\": rpc error: code = NotFound desc = could not find container \"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c\": container with ID starting with cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c not found: ID does not exist" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.658802 4839 scope.go:117] "RemoveContainer" containerID="bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5" Mar 21 04:45:10 crc kubenswrapper[4839]: E0321 04:45:10.659082 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5\": container with ID starting with bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5 not found: ID does not exist" containerID="bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.659108 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5"} err="failed to get container status \"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5\": rpc error: code = NotFound desc = could not find container \"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5\": 
container with ID starting with bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5 not found: ID does not exist" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.659126 4839 scope.go:117] "RemoveContainer" containerID="cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.659370 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c"} err="failed to get container status \"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c\": rpc error: code = NotFound desc = could not find container \"cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c\": container with ID starting with cd27cd089c3832825f90b0c44877ec010b880b3484062118b2e7c9a67fa21c1c not found: ID does not exist" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.659395 4839 scope.go:117] "RemoveContainer" containerID="bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.659585 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.659627 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5"} err="failed to get container status \"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5\": rpc error: code = NotFound desc = could not find container \"bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5\": container with ID starting with bbbc59bd3ae95b686894b1a781c124d75cbce182408b10ddc7aaf1fc47510ff5 not found: ID does not exist" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.659646 4839 scope.go:117] "RemoveContainer" containerID="97884e844baf73e80ed5f7a5c51d988d7d7009365523dceffa2a7bc9d1e19948" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.696130 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.726113 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-run-httpd\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.726166 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-log-httpd\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.726209 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.726279 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmhp2\" (UniqueName: \"kubernetes.io/projected/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-kube-api-access-vmhp2\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.726348 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-scripts\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.726428 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.726719 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-config-data\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.729505 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.749818 4839 scope.go:117] "RemoveContainer" 
containerID="0a64d9a20f4f5b5d0b9782608a440c655769c9db2754bb98b7278494dc83ae14" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.785318 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.815120 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.816992 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.826502 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.826905 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.826937 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.833209 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmhp2\" (UniqueName: \"kubernetes.io/projected/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-kube-api-access-vmhp2\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.833294 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-scripts\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.833331 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.833390 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-config-data\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.833518 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-run-httpd\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.833546 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-log-httpd\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.833603 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.834964 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-run-httpd\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.841126 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-scripts\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.841913 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.850539 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-log-httpd\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.852818 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.854052 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-config-data\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.855036 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.878338 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmhp2\" (UniqueName: 
\"kubernetes.io/projected/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-kube-api-access-vmhp2\") pod \"ceilometer-0\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " pod="openstack/ceilometer-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935545 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-config-data-custom\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935635 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935664 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5162af3c-3b00-4643-afd9-680f6e2f5c03-logs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935719 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-scripts\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935739 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935770 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935853 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvp2q\" (UniqueName: \"kubernetes.io/projected/5162af3c-3b00-4643-afd9-680f6e2f5c03-kube-api-access-vvp2q\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.935985 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5162af3c-3b00-4643-afd9-680f6e2f5c03-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:10 crc kubenswrapper[4839]: I0321 04:45:10.936024 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-config-data\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.037829 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvp2q\" (UniqueName: \"kubernetes.io/projected/5162af3c-3b00-4643-afd9-680f6e2f5c03-kube-api-access-vvp2q\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: 
I0321 04:45:11.038177 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5162af3c-3b00-4643-afd9-680f6e2f5c03-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038265 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-config-data\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038352 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-config-data-custom\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038476 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038556 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5162af3c-3b00-4643-afd9-680f6e2f5c03-logs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038681 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-scripts\") pod \"cinder-api-0\" (UID: 
\"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038746 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038823 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.038430 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5162af3c-3b00-4643-afd9-680f6e2f5c03-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.040035 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5162af3c-3b00-4643-afd9-680f6e2f5c03-logs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.042599 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.043790 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-config-data\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.044087 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-config-data-custom\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.044235 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.048282 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-scripts\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.055694 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5162af3c-3b00-4643-afd9-680f6e2f5c03-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.059468 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvp2q\" (UniqueName: \"kubernetes.io/projected/5162af3c-3b00-4643-afd9-680f6e2f5c03-kube-api-access-vvp2q\") pod \"cinder-api-0\" (UID: \"5162af3c-3b00-4643-afd9-680f6e2f5c03\") " pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.075145 4839 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.087916 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.276797 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.530789 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-9c97f4dbd-k2scs" Mar 21 04:45:11 crc kubenswrapper[4839]: W0321 04:45:11.530914 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4b205eb_a84a_4c2f_8b49_068d4e0a8ec9.slice/crio-c2ee73cd19c0a119a6c2d0a87e1fbe0c6315a3f99b97a20c1aa91548c3c2c2dc WatchSource:0}: Error finding container c2ee73cd19c0a119a6c2d0a87e1fbe0c6315a3f99b97a20c1aa91548c3c2c2dc: Status 404 returned error can't find the container with id c2ee73cd19c0a119a6c2d0a87e1fbe0c6315a3f99b97a20c1aa91548c3c2c2dc Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.531588 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.561542 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerStarted","Data":"c2ee73cd19c0a119a6c2d0a87e1fbe0c6315a3f99b97a20c1aa91548c3c2c2dc"} Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.699326 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 21 04:45:11 crc kubenswrapper[4839]: W0321 04:45:11.702209 4839 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5162af3c_3b00_4643_afd9_680f6e2f5c03.slice/crio-9ef657035e164bc132f225494533c3caf524d329fd3782910ff8fe4a3680f936 WatchSource:0}: Error finding container 9ef657035e164bc132f225494533c3caf524d329fd3782910ff8fe4a3680f936: Status 404 returned error can't find the container with id 9ef657035e164bc132f225494533c3caf524d329fd3782910ff8fe4a3680f936 Mar 21 04:45:11 crc kubenswrapper[4839]: I0321 04:45:11.720777 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-84c6c985f8-v5cmh" Mar 21 04:45:12 crc kubenswrapper[4839]: I0321 04:45:12.465501 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63efa50f-a0e7-4912-bbd8-c610daf572fd" path="/var/lib/kubelet/pods/63efa50f-a0e7-4912-bbd8-c610daf572fd/volumes" Mar 21 04:45:12 crc kubenswrapper[4839]: I0321 04:45:12.466476 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c266726-5bfd-4519-bdd5-9db7f6a77df4" path="/var/lib/kubelet/pods/6c266726-5bfd-4519-bdd5-9db7f6a77df4/volumes" Mar 21 04:45:12 crc kubenswrapper[4839]: I0321 04:45:12.582445 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5162af3c-3b00-4643-afd9-680f6e2f5c03","Type":"ContainerStarted","Data":"857a3c055c4bbb192eb5427ba4b0f790e0a6fefcec9392f76c0e8b227ed287ad"} Mar 21 04:45:12 crc kubenswrapper[4839]: I0321 04:45:12.582492 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5162af3c-3b00-4643-afd9-680f6e2f5c03","Type":"ContainerStarted","Data":"9ef657035e164bc132f225494533c3caf524d329fd3782910ff8fe4a3680f936"} Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.451187 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-9c97f4dbd-k2scs" Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.578332 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-84c6c985f8-v5cmh"] Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.578968 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon-log" containerID="cri-o://0bc7ef10848b0da5e68b6c3552cc343013046d2176bf665b0d2389f263149510" gracePeriod=30 Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.579475 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" containerID="cri-o://e004b9646c4df34c1d5bba67912a6fa76f3cccc25c7980ab777e369e37ce16c9" gracePeriod=30 Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.585419 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.612411 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5162af3c-3b00-4643-afd9-680f6e2f5c03","Type":"ContainerStarted","Data":"fdc9a5f2c9f812e6c489afb2deb5cfd846ff8ba1578516379dd6a810bb73aecc"} Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.613749 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.615963 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerStarted","Data":"535723ef61f58618feb7059f038c3fae7ab1b8f214f13d762b44e16ad7930481"} Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.615995 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerStarted","Data":"c93a95cdff62e2e7b6e4859da043de79cf7d681e70275fda634ef213fa4a479c"} Mar 21 04:45:13 crc kubenswrapper[4839]: I0321 04:45:13.647166 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.647147298 podStartE2EDuration="3.647147298s" podCreationTimestamp="2026-03-21 04:45:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:13.633326351 +0000 UTC m=+1317.961113027" watchObservedRunningTime="2026-03-21 04:45:13.647147298 +0000 UTC m=+1317.974933974" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.193319 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.218125 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.478407 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-75bd8b89b4-djjlh"] Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.482472 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.509412 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75bd8b89b4-djjlh"] Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.636977 4839 generic.go:334] "Generic (PLEG): container finished" podID="e965d008-890b-408c-a5a8-823aca00140a" containerID="6e416952cf65a99f24d43cb637a81bb2e071806b75507c88029a3d669986edf2" exitCode=0 Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.637478 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98964f649-mrjrt" event={"ID":"e965d008-890b-408c-a5a8-823aca00140a","Type":"ContainerDied","Data":"6e416952cf65a99f24d43cb637a81bb2e071806b75507c88029a3d669986edf2"} Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.667807 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-config-data\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.667905 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-public-tls-certs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.667940 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5a44f8-8eb1-4953-b611-a02576e414ea-logs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:15 crc 
kubenswrapper[4839]: I0321 04:45:15.667994 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdqxc\" (UniqueName: \"kubernetes.io/projected/bf5a44f8-8eb1-4953-b611-a02576e414ea-kube-api-access-jdqxc\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.668090 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-internal-tls-certs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.668120 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-scripts\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.668154 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-combined-ca-bundle\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.770824 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-internal-tls-certs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.771103 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-scripts\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.771168 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-combined-ca-bundle\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.772944 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-config-data\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.773130 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-public-tls-certs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.773187 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5a44f8-8eb1-4953-b611-a02576e414ea-logs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.773301 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdqxc\" (UniqueName: \"kubernetes.io/projected/bf5a44f8-8eb1-4953-b611-a02576e414ea-kube-api-access-jdqxc\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.773561 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf5a44f8-8eb1-4953-b611-a02576e414ea-logs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.777428 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-config-data\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.779178 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-internal-tls-certs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.780096 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-scripts\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.790297 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-public-tls-certs\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.791127 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdqxc\" (UniqueName: \"kubernetes.io/projected/bf5a44f8-8eb1-4953-b611-a02576e414ea-kube-api-access-jdqxc\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.798193 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf5a44f8-8eb1-4953-b611-a02576e414ea-combined-ca-bundle\") pod \"placement-75bd8b89b4-djjlh\" (UID: \"bf5a44f8-8eb1-4953-b611-a02576e414ea\") " pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.855008 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-98964f649-mrjrt"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.917066 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.978445 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-public-tls-certs\") pod \"e965d008-890b-408c-a5a8-823aca00140a\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") "
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.978542 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-config\") pod \"e965d008-890b-408c-a5a8-823aca00140a\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") "
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.978607 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-internal-tls-certs\") pod \"e965d008-890b-408c-a5a8-823aca00140a\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") "
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.978632 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-ovndb-tls-certs\") pod \"e965d008-890b-408c-a5a8-823aca00140a\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") "
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.978772 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-combined-ca-bundle\") pod \"e965d008-890b-408c-a5a8-823aca00140a\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") "
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.978878 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbt6d\" (UniqueName: \"kubernetes.io/projected/e965d008-890b-408c-a5a8-823aca00140a-kube-api-access-wbt6d\") pod \"e965d008-890b-408c-a5a8-823aca00140a\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") "
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.978893 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-httpd-config\") pod \"e965d008-890b-408c-a5a8-823aca00140a\" (UID: \"e965d008-890b-408c-a5a8-823aca00140a\") "
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.987074 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e965d008-890b-408c-a5a8-823aca00140a-kube-api-access-wbt6d" (OuterVolumeSpecName: "kube-api-access-wbt6d") pod "e965d008-890b-408c-a5a8-823aca00140a" (UID: "e965d008-890b-408c-a5a8-823aca00140a"). InnerVolumeSpecName "kube-api-access-wbt6d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.987697 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e965d008-890b-408c-a5a8-823aca00140a" (UID: "e965d008-890b-408c-a5a8-823aca00140a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:15 crc kubenswrapper[4839]: I0321 04:45:15.992757 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-qc28r"
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.071761 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e965d008-890b-408c-a5a8-823aca00140a" (UID: "e965d008-890b-408c-a5a8-823aca00140a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.084775 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e965d008-890b-408c-a5a8-823aca00140a" (UID: "e965d008-890b-408c-a5a8-823aca00140a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.085063 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbt6d\" (UniqueName: \"kubernetes.io/projected/e965d008-890b-408c-a5a8-823aca00140a-kube-api-access-wbt6d\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.085100 4839 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.085110 4839 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.085123 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.125363 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-config" (OuterVolumeSpecName: "config") pod "e965d008-890b-408c-a5a8-823aca00140a" (UID: "e965d008-890b-408c-a5a8-823aca00140a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.137004 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-k67ln"]
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.137366 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerName="dnsmasq-dns" containerID="cri-o://d7d86bc6d96470a04c1fc681cf73561b422455dc884417b0677e9ae418f682f0" gracePeriod=10
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.158931 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e965d008-890b-408c-a5a8-823aca00140a" (UID: "e965d008-890b-408c-a5a8-823aca00140a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.191889 4839 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.191923 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.233445 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e965d008-890b-408c-a5a8-823aca00140a" (UID: "e965d008-890b-408c-a5a8-823aca00140a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.292612 4839 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e965d008-890b-408c-a5a8-823aca00140a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.542105 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.554655 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75bd8b89b4-djjlh"]
Mar 21 04:45:16 crc kubenswrapper[4839]: W0321 04:45:16.568867 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf5a44f8_8eb1_4953_b611_a02576e414ea.slice/crio-8b4121a67833b28031e0850228e8d0dd605437ca3e0db7dd43ea487dc7db5f3b WatchSource:0}: Error finding container 8b4121a67833b28031e0850228e8d0dd605437ca3e0db7dd43ea487dc7db5f3b: Status 404 returned error can't find the container with id 8b4121a67833b28031e0850228e8d0dd605437ca3e0db7dd43ea487dc7db5f3b
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.620906 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.742105 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerStarted","Data":"c23f136b850cb236ed5c6370a36a398b1c3d6f65a75d6f5afc35d4214b4f0367"}
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.751895 4839 generic.go:334] "Generic (PLEG): container finished" podID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerID="d7d86bc6d96470a04c1fc681cf73561b422455dc884417b0677e9ae418f682f0" exitCode=0
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.751994 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" event={"ID":"ac45c53b-2486-47d1-aaf4-23b76adfd431","Type":"ContainerDied","Data":"d7d86bc6d96470a04c1fc681cf73561b422455dc884417b0677e9ae418f682f0"}
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.752066 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" event={"ID":"ac45c53b-2486-47d1-aaf4-23b76adfd431","Type":"ContainerDied","Data":"2f1d63d47cab7235a54c65fca44b344c795fcf2a4d8ecfd84ead6214de4729cf"}
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.752085 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f1d63d47cab7235a54c65fca44b344c795fcf2a4d8ecfd84ead6214de4729cf"
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.752823 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln"
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.754504 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98964f649-mrjrt" event={"ID":"e965d008-890b-408c-a5a8-823aca00140a","Type":"ContainerDied","Data":"965f07a77abecc2dcc57bce66cbf446672e1ba03feecba479f6dc24ff5964cee"}
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.754622 4839 scope.go:117] "RemoveContainer" containerID="c52f7b158358ef8b38cfac03210bf15a4ca76a8dbb9c567dbd73763e507062d1"
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.754780 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-98964f649-mrjrt"
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.759885 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75bd8b89b4-djjlh" event={"ID":"bf5a44f8-8eb1-4953-b611-a02576e414ea","Type":"ContainerStarted","Data":"8b4121a67833b28031e0850228e8d0dd605437ca3e0db7dd43ea487dc7db5f3b"}
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.760005 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="cinder-scheduler" containerID="cri-o://94e949d149a7c44ebe62d3aa61b17816324f2bfff6307c3c5a788a6a257442fa" gracePeriod=30
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.760088 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="probe" containerID="cri-o://29fe57ad79066c4ceb1edc999f7c370920b15cd9f7e5562d2b7d130cb45455af" gracePeriod=30
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.791141 4839 scope.go:117] "RemoveContainer" containerID="6e416952cf65a99f24d43cb637a81bb2e071806b75507c88029a3d669986edf2"
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.817797 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-98964f649-mrjrt"]
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.834628 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-98964f649-mrjrt"]
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.843712 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-nb\") pod \"ac45c53b-2486-47d1-aaf4-23b76adfd431\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") "
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.844116 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-svc\") pod \"ac45c53b-2486-47d1-aaf4-23b76adfd431\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") "
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.844198 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-swift-storage-0\") pod \"ac45c53b-2486-47d1-aaf4-23b76adfd431\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") "
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.844279 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-sb\") pod \"ac45c53b-2486-47d1-aaf4-23b76adfd431\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") "
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.844331 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-config\") pod \"ac45c53b-2486-47d1-aaf4-23b76adfd431\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") "
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.844388 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn7xh\" (UniqueName: \"kubernetes.io/projected/ac45c53b-2486-47d1-aaf4-23b76adfd431-kube-api-access-xn7xh\") pod \"ac45c53b-2486-47d1-aaf4-23b76adfd431\" (UID: \"ac45c53b-2486-47d1-aaf4-23b76adfd431\") "
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.853449 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac45c53b-2486-47d1-aaf4-23b76adfd431-kube-api-access-xn7xh" (OuterVolumeSpecName: "kube-api-access-xn7xh") pod "ac45c53b-2486-47d1-aaf4-23b76adfd431" (UID: "ac45c53b-2486-47d1-aaf4-23b76adfd431"). InnerVolumeSpecName "kube-api-access-xn7xh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.932201 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ac45c53b-2486-47d1-aaf4-23b76adfd431" (UID: "ac45c53b-2486-47d1-aaf4-23b76adfd431"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.933799 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ac45c53b-2486-47d1-aaf4-23b76adfd431" (UID: "ac45c53b-2486-47d1-aaf4-23b76adfd431"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.943008 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ac45c53b-2486-47d1-aaf4-23b76adfd431" (UID: "ac45c53b-2486-47d1-aaf4-23b76adfd431"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.946920 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.947324 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.947473 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn7xh\" (UniqueName: \"kubernetes.io/projected/ac45c53b-2486-47d1-aaf4-23b76adfd431-kube-api-access-xn7xh\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.947733 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.950058 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac45c53b-2486-47d1-aaf4-23b76adfd431" (UID: "ac45c53b-2486-47d1-aaf4-23b76adfd431"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.974064 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-config" (OuterVolumeSpecName: "config") pod "ac45c53b-2486-47d1-aaf4-23b76adfd431" (UID: "ac45c53b-2486-47d1-aaf4-23b76adfd431"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 04:45:16 crc kubenswrapper[4839]: I0321 04:45:16.983789 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:40434->10.217.0.151:8443: read: connection reset by peer"
Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.053071 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-config\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.053110 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac45c53b-2486-47d1-aaf4-23b76adfd431-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.771910 4839 generic.go:334] "Generic (PLEG): container finished" podID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerID="29fe57ad79066c4ceb1edc999f7c370920b15cd9f7e5562d2b7d130cb45455af" exitCode=0
Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.772103 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82b135c8-5fc8-4930-9577-1dd9181a1dae","Type":"ContainerDied","Data":"29fe57ad79066c4ceb1edc999f7c370920b15cd9f7e5562d2b7d130cb45455af"}
Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.774234 4839 generic.go:334] "Generic (PLEG): container finished" podID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerID="e004b9646c4df34c1d5bba67912a6fa76f3cccc25c7980ab777e369e37ce16c9" exitCode=0
Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.774286 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c6c985f8-v5cmh" event={"ID":"b3b26c3a-55d5-442a-9c31-187b0aa60f90","Type":"ContainerDied","Data":"e004b9646c4df34c1d5bba67912a6fa76f3cccc25c7980ab777e369e37ce16c9"}
Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.777396 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln"
Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.783742 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75bd8b89b4-djjlh" event={"ID":"bf5a44f8-8eb1-4953-b611-a02576e414ea","Type":"ContainerStarted","Data":"371fbcf0e25d611c8e7011aab1b093aa65787f92921f45d7a12f88bef0f0c11a"}
Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.784106 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75bd8b89b4-djjlh" event={"ID":"bf5a44f8-8eb1-4953-b611-a02576e414ea","Type":"ContainerStarted","Data":"1582457b3b52da6361fec62b9a2f0adae91fef69d3f2274a400f4dcc5a17c915"}
Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.784209 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.784307 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75bd8b89b4-djjlh"
Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.837785 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-75bd8b89b4-djjlh" podStartSLOduration=2.837766972 podStartE2EDuration="2.837766972s" podCreationTimestamp="2026-03-21 04:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:17.834374867 +0000 UTC m=+1322.162161543" watchObservedRunningTime="2026-03-21 04:45:17.837766972 +0000 UTC m=+1322.165553648"
Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.881545 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-k67ln"]
Mar 21 04:45:17 crc kubenswrapper[4839]: I0321 04:45:17.888804 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-k67ln"]
Mar 21 04:45:18 crc kubenswrapper[4839]: I0321 04:45:18.466305 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" path="/var/lib/kubelet/pods/ac45c53b-2486-47d1-aaf4-23b76adfd431/volumes"
Mar 21 04:45:18 crc kubenswrapper[4839]: I0321 04:45:18.467480 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e965d008-890b-408c-a5a8-823aca00140a" path="/var/lib/kubelet/pods/e965d008-890b-408c-a5a8-823aca00140a/volumes"
Mar 21 04:45:18 crc kubenswrapper[4839]: I0321 04:45:18.613504 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-cb996784d-fvhvp"
Mar 21 04:45:18 crc kubenswrapper[4839]: I0321 04:45:18.789196 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerStarted","Data":"4474b8a6d0b40b2af1d9b2dc57ca277c0c905892995cd020658c486bec967d7a"}
Mar 21 04:45:18 crc kubenswrapper[4839]: I0321 04:45:18.814448 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.815438702 podStartE2EDuration="8.814431035s" podCreationTimestamp="2026-03-21 04:45:10 +0000 UTC" firstStartedPulling="2026-03-21 04:45:11.53318052 +0000 UTC m=+1315.860967186" lastFinishedPulling="2026-03-21 04:45:17.532172843 +0000 UTC m=+1321.859959519" observedRunningTime="2026-03-21 04:45:18.812558582 +0000 UTC m=+1323.140345268" watchObservedRunningTime="2026-03-21 04:45:18.814431035 +0000 UTC m=+1323.142217711"
Mar 21 04:45:19 crc kubenswrapper[4839]: I0321 04:45:19.261937 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused"
Mar 21 04:45:19 crc kubenswrapper[4839]: I0321 04:45:19.827367 4839 generic.go:334] "Generic (PLEG): container finished" podID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerID="94e949d149a7c44ebe62d3aa61b17816324f2bfff6307c3c5a788a6a257442fa" exitCode=0
Mar 21 04:45:19 crc kubenswrapper[4839]: I0321 04:45:19.827471 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82b135c8-5fc8-4930-9577-1dd9181a1dae","Type":"ContainerDied","Data":"94e949d149a7c44ebe62d3aa61b17816324f2bfff6307c3c5a788a6a257442fa"}
Mar 21 04:45:19 crc kubenswrapper[4839]: I0321 04:45:19.827987 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.293610 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.424594 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-combined-ca-bundle\") pod \"82b135c8-5fc8-4930-9577-1dd9181a1dae\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") "
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.424776 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data-custom\") pod \"82b135c8-5fc8-4930-9577-1dd9181a1dae\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") "
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.424809 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nhfj\" (UniqueName: \"kubernetes.io/projected/82b135c8-5fc8-4930-9577-1dd9181a1dae-kube-api-access-9nhfj\") pod \"82b135c8-5fc8-4930-9577-1dd9181a1dae\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") "
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.424842 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data\") pod \"82b135c8-5fc8-4930-9577-1dd9181a1dae\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") "
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.425782 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82b135c8-5fc8-4930-9577-1dd9181a1dae-etc-machine-id\") pod \"82b135c8-5fc8-4930-9577-1dd9181a1dae\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") "
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.425894 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-scripts\") pod \"82b135c8-5fc8-4930-9577-1dd9181a1dae\" (UID: \"82b135c8-5fc8-4930-9577-1dd9181a1dae\") "
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.425894 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82b135c8-5fc8-4930-9577-1dd9181a1dae-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "82b135c8-5fc8-4930-9577-1dd9181a1dae" (UID: "82b135c8-5fc8-4930-9577-1dd9181a1dae"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.426517 4839 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82b135c8-5fc8-4930-9577-1dd9181a1dae-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.455992 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "82b135c8-5fc8-4930-9577-1dd9181a1dae" (UID: "82b135c8-5fc8-4930-9577-1dd9181a1dae"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.457330 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82b135c8-5fc8-4930-9577-1dd9181a1dae-kube-api-access-9nhfj" (OuterVolumeSpecName: "kube-api-access-9nhfj") pod "82b135c8-5fc8-4930-9577-1dd9181a1dae" (UID: "82b135c8-5fc8-4930-9577-1dd9181a1dae"). InnerVolumeSpecName "kube-api-access-9nhfj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.459722 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-scripts" (OuterVolumeSpecName: "scripts") pod "82b135c8-5fc8-4930-9577-1dd9181a1dae" (UID: "82b135c8-5fc8-4930-9577-1dd9181a1dae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.499545 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82b135c8-5fc8-4930-9577-1dd9181a1dae" (UID: "82b135c8-5fc8-4930-9577-1dd9181a1dae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.529893 4839 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.529932 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nhfj\" (UniqueName: \"kubernetes.io/projected/82b135c8-5fc8-4930-9577-1dd9181a1dae-kube-api-access-9nhfj\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.529947 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-scripts\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.529961 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.576764 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data" (OuterVolumeSpecName: "config-data") pod "82b135c8-5fc8-4930-9577-1dd9181a1dae" (UID: "82b135c8-5fc8-4930-9577-1dd9181a1dae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.632746 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b135c8-5fc8-4930-9577-1dd9181a1dae-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.851942 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82b135c8-5fc8-4930-9577-1dd9181a1dae","Type":"ContainerDied","Data":"63601f472b5773f8a2494b64c1e43c04ccfd11d451bf08893682392398171dec"}
Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.851957 4839 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.852215 4839 scope.go:117] "RemoveContainer" containerID="29fe57ad79066c4ceb1edc999f7c370920b15cd9f7e5562d2b7d130cb45455af" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.896627 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.899402 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.901966 4839 scope.go:117] "RemoveContainer" containerID="94e949d149a7c44ebe62d3aa61b17816324f2bfff6307c3c5a788a6a257442fa" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.922926 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 04:45:20 crc kubenswrapper[4839]: E0321 04:45:20.923339 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="cinder-scheduler" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923361 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="cinder-scheduler" Mar 21 04:45:20 crc kubenswrapper[4839]: E0321 04:45:20.923387 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="probe" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923399 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="probe" Mar 21 04:45:20 crc kubenswrapper[4839]: E0321 04:45:20.923434 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerName="init" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923442 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerName="init" Mar 21 04:45:20 crc kubenswrapper[4839]: E0321 04:45:20.923461 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-httpd" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923470 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-httpd" Mar 21 04:45:20 crc kubenswrapper[4839]: E0321 04:45:20.923483 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerName="dnsmasq-dns" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923491 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerName="dnsmasq-dns" Mar 21 04:45:20 crc kubenswrapper[4839]: E0321 04:45:20.923514 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-api" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923522 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-api" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923758 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="probe" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923782 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e965d008-890b-408c-a5a8-823aca00140a" containerName="neutron-api" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923798 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerName="dnsmasq-dns" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923816 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e965d008-890b-408c-a5a8-823aca00140a" 
containerName="neutron-httpd" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.923826 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" containerName="cinder-scheduler" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.924946 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.929777 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 21 04:45:20 crc kubenswrapper[4839]: I0321 04:45:20.938188 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.040185 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hxmg\" (UniqueName: \"kubernetes.io/projected/77964653-d242-4258-b06e-c9cd0fb64d84-kube-api-access-6hxmg\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.040237 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-config-data\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.040270 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.040402 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77964653-d242-4258-b06e-c9cd0fb64d84-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.040487 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-scripts\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.040599 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.142008 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hxmg\" (UniqueName: \"kubernetes.io/projected/77964653-d242-4258-b06e-c9cd0fb64d84-kube-api-access-6hxmg\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.142053 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-config-data\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.142077 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.142102 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77964653-d242-4258-b06e-c9cd0fb64d84-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.142172 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-scripts\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.142223 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.142264 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/77964653-d242-4258-b06e-c9cd0fb64d84-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.148980 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " 
pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.149176 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.149362 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-scripts\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.151652 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77964653-d242-4258-b06e-c9cd0fb64d84-config-data\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.165167 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hxmg\" (UniqueName: \"kubernetes.io/projected/77964653-d242-4258-b06e-c9cd0fb64d84-kube-api-access-6hxmg\") pod \"cinder-scheduler-0\" (UID: \"77964653-d242-4258-b06e-c9cd0fb64d84\") " pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.258488 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.506836 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-848cf88cfc-k67ln" podUID="ac45c53b-2486-47d1-aaf4-23b76adfd431" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.165:5353: i/o timeout" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.815236 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.839243 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.840543 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.843241 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.843306 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-4vnsh" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.846846 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.866252 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.885337 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"77964653-d242-4258-b06e-c9cd0fb64d84","Type":"ContainerStarted","Data":"1618cf281ea19a9f57a3e69a69a8dc2918fb433aa2cfe42c7bca56c00689cd62"} Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.959663 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.959793 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config-secret\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.959861 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:21 crc kubenswrapper[4839]: I0321 04:45:21.959904 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tlqc\" (UniqueName: \"kubernetes.io/projected/829e2047-17e6-49ec-9baf-1339c0f5aea6-kube-api-access-7tlqc\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.061868 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.061928 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tlqc\" (UniqueName: 
\"kubernetes.io/projected/829e2047-17e6-49ec-9baf-1339c0f5aea6-kube-api-access-7tlqc\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.062018 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.062073 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config-secret\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.063114 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.074903 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.085283 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config-secret\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " 
pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.095171 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tlqc\" (UniqueName: \"kubernetes.io/projected/829e2047-17e6-49ec-9baf-1339c0f5aea6-kube-api-access-7tlqc\") pod \"openstackclient\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.125497 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.125795 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.152932 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.211043 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.214461 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.223214 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.268873 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtpr9\" (UniqueName: \"kubernetes.io/projected/52b9f7e1-d86c-457e-9391-eee855a9f7a7-kube-api-access-mtpr9\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.268917 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b9f7e1-d86c-457e-9391-eee855a9f7a7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.268950 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/52b9f7e1-d86c-457e-9391-eee855a9f7a7-openstack-config-secret\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.269122 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/52b9f7e1-d86c-457e-9391-eee855a9f7a7-openstack-config\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: E0321 04:45:22.317243 4839 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 21 04:45:22 crc kubenswrapper[4839]: rpc error: code = Unknown desc = failed to create pod 
network sandbox k8s_openstackclient_openstack_829e2047-17e6-49ec-9baf-1339c0f5aea6_0(cccec67cfb8155b62fbd9064b5fd161612d7a40d0d6bccbf8bda1bbbb638ecac): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cccec67cfb8155b62fbd9064b5fd161612d7a40d0d6bccbf8bda1bbbb638ecac" Netns:"/var/run/netns/a9a1b473-5023-4516-8b95-be260f3e2bc9" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=cccec67cfb8155b62fbd9064b5fd161612d7a40d0d6bccbf8bda1bbbb638ecac;K8S_POD_UID=829e2047-17e6-49ec-9baf-1339c0f5aea6" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/829e2047-17e6-49ec-9baf-1339c0f5aea6]: expected pod UID "829e2047-17e6-49ec-9baf-1339c0f5aea6" but got "52b9f7e1-d86c-457e-9391-eee855a9f7a7" from Kube API Mar 21 04:45:22 crc kubenswrapper[4839]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 21 04:45:22 crc kubenswrapper[4839]: > Mar 21 04:45:22 crc kubenswrapper[4839]: E0321 04:45:22.317316 4839 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 21 04:45:22 crc kubenswrapper[4839]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_829e2047-17e6-49ec-9baf-1339c0f5aea6_0(cccec67cfb8155b62fbd9064b5fd161612d7a40d0d6bccbf8bda1bbbb638ecac): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd 
(shim): CNI request failed with status 400: 'ContainerID:"cccec67cfb8155b62fbd9064b5fd161612d7a40d0d6bccbf8bda1bbbb638ecac" Netns:"/var/run/netns/a9a1b473-5023-4516-8b95-be260f3e2bc9" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=cccec67cfb8155b62fbd9064b5fd161612d7a40d0d6bccbf8bda1bbbb638ecac;K8S_POD_UID=829e2047-17e6-49ec-9baf-1339c0f5aea6" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/829e2047-17e6-49ec-9baf-1339c0f5aea6]: expected pod UID "829e2047-17e6-49ec-9baf-1339c0f5aea6" but got "52b9f7e1-d86c-457e-9391-eee855a9f7a7" from Kube API Mar 21 04:45:22 crc kubenswrapper[4839]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 21 04:45:22 crc kubenswrapper[4839]: > pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.371970 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtpr9\" (UniqueName: \"kubernetes.io/projected/52b9f7e1-d86c-457e-9391-eee855a9f7a7-kube-api-access-mtpr9\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.372041 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b9f7e1-d86c-457e-9391-eee855a9f7a7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.372092 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/52b9f7e1-d86c-457e-9391-eee855a9f7a7-openstack-config-secret\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.372185 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/52b9f7e1-d86c-457e-9391-eee855a9f7a7-openstack-config\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.373606 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/52b9f7e1-d86c-457e-9391-eee855a9f7a7-openstack-config\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.375974 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/52b9f7e1-d86c-457e-9391-eee855a9f7a7-openstack-config-secret\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.376042 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52b9f7e1-d86c-457e-9391-eee855a9f7a7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.392556 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtpr9\" (UniqueName: 
\"kubernetes.io/projected/52b9f7e1-d86c-457e-9391-eee855a9f7a7-kube-api-access-mtpr9\") pod \"openstackclient\" (UID: \"52b9f7e1-d86c-457e-9391-eee855a9f7a7\") " pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.469818 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82b135c8-5fc8-4930-9577-1dd9181a1dae" path="/var/lib/kubelet/pods/82b135c8-5fc8-4930-9577-1dd9181a1dae/volumes" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.535430 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.916851 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.918493 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"77964653-d242-4258-b06e-c9cd0fb64d84","Type":"ContainerStarted","Data":"0c7f33b3e0cfcb04e6e2b5015ac76fb269b31f642a83f153e7c4a18eadb3d006"} Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.952939 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 21 04:45:22 crc kubenswrapper[4839]: I0321 04:45:22.956113 4839 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="829e2047-17e6-49ec-9baf-1339c0f5aea6" podUID="52b9f7e1-d86c-457e-9391-eee855a9f7a7" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.026616 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.084141 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tlqc\" (UniqueName: \"kubernetes.io/projected/829e2047-17e6-49ec-9baf-1339c0f5aea6-kube-api-access-7tlqc\") pod \"829e2047-17e6-49ec-9baf-1339c0f5aea6\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.084250 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config-secret\") pod \"829e2047-17e6-49ec-9baf-1339c0f5aea6\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.084510 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-combined-ca-bundle\") pod \"829e2047-17e6-49ec-9baf-1339c0f5aea6\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.084660 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config\") pod \"829e2047-17e6-49ec-9baf-1339c0f5aea6\" (UID: \"829e2047-17e6-49ec-9baf-1339c0f5aea6\") " Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 
04:45:23.085298 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "829e2047-17e6-49ec-9baf-1339c0f5aea6" (UID: "829e2047-17e6-49ec-9baf-1339c0f5aea6"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.092707 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/829e2047-17e6-49ec-9baf-1339c0f5aea6-kube-api-access-7tlqc" (OuterVolumeSpecName: "kube-api-access-7tlqc") pod "829e2047-17e6-49ec-9baf-1339c0f5aea6" (UID: "829e2047-17e6-49ec-9baf-1339c0f5aea6"). InnerVolumeSpecName "kube-api-access-7tlqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.092881 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "829e2047-17e6-49ec-9baf-1339c0f5aea6" (UID: "829e2047-17e6-49ec-9baf-1339c0f5aea6"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.093860 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "829e2047-17e6-49ec-9baf-1339c0f5aea6" (UID: "829e2047-17e6-49ec-9baf-1339c0f5aea6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.186484 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.186521 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.186533 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tlqc\" (UniqueName: \"kubernetes.io/projected/829e2047-17e6-49ec-9baf-1339c0f5aea6-kube-api-access-7tlqc\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.186547 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/829e2047-17e6-49ec-9baf-1339c0f5aea6-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.686349 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.932687 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"52b9f7e1-d86c-457e-9391-eee855a9f7a7","Type":"ContainerStarted","Data":"c327f228ec049650eba893fbc6b85e088c562445e9ab6b437e18033dbe633ec6"} Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.938517 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.940991 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"77964653-d242-4258-b06e-c9cd0fb64d84","Type":"ContainerStarted","Data":"bfa5e215a8064142662175001ba32c6559c68245a861ff8dad6895a48d51f16f"} Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.966064 4839 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="829e2047-17e6-49ec-9baf-1339c0f5aea6" podUID="52b9f7e1-d86c-457e-9391-eee855a9f7a7" Mar 21 04:45:23 crc kubenswrapper[4839]: I0321 04:45:23.969497 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.969477808 podStartE2EDuration="3.969477808s" podCreationTimestamp="2026-03-21 04:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:23.963510701 +0000 UTC m=+1328.291297377" watchObservedRunningTime="2026-03-21 04:45:23.969477808 +0000 UTC m=+1328.297264484" Mar 21 04:45:24 crc kubenswrapper[4839]: I0321 04:45:24.477157 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="829e2047-17e6-49ec-9baf-1339c0f5aea6" path="/var/lib/kubelet/pods/829e2047-17e6-49ec-9baf-1339c0f5aea6/volumes" Mar 21 04:45:26 crc kubenswrapper[4839]: I0321 04:45:26.259831 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.467345 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-b66c6bfff-76gfx"] Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.469896 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.476167 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.476310 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.476357 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.510293 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-b66c6bfff-76gfx"] Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565419 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-log-httpd\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565463 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-run-httpd\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565487 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-config-data\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565512 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-internal-tls-certs\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565562 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-combined-ca-bundle\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565607 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-etc-swift\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565630 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-public-tls-certs\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.565715 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfhdt\" (UniqueName: \"kubernetes.io/projected/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-kube-api-access-zfhdt\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc 
kubenswrapper[4839]: I0321 04:45:27.667901 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-etc-swift\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.667968 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-public-tls-certs\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668077 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfhdt\" (UniqueName: \"kubernetes.io/projected/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-kube-api-access-zfhdt\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668186 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-log-httpd\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668212 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-run-httpd\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668239 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-config-data\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668272 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-internal-tls-certs\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668331 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-combined-ca-bundle\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668892 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-log-httpd\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.668973 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-run-httpd\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.674893 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-public-tls-certs\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.675217 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-config-data\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.676583 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-combined-ca-bundle\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.680300 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-internal-tls-certs\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.681833 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-etc-swift\") pod \"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.685435 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfhdt\" (UniqueName: \"kubernetes.io/projected/1af5fd5b-8392-4e55-b3fb-fdc9285dd135-kube-api-access-zfhdt\") pod 
\"swift-proxy-b66c6bfff-76gfx\" (UID: \"1af5fd5b-8392-4e55-b3fb-fdc9285dd135\") " pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:27 crc kubenswrapper[4839]: I0321 04:45:27.792201 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:29 crc kubenswrapper[4839]: I0321 04:45:29.262436 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.347099 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.347456 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-central-agent" containerID="cri-o://c93a95cdff62e2e7b6e4859da043de79cf7d681e70275fda634ef213fa4a479c" gracePeriod=30 Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.347554 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="proxy-httpd" containerID="cri-o://4474b8a6d0b40b2af1d9b2dc57ca277c0c905892995cd020658c486bec967d7a" gracePeriod=30 Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.347615 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-notification-agent" containerID="cri-o://535723ef61f58618feb7059f038c3fae7ab1b8f214f13d762b44e16ad7930481" gracePeriod=30 Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.347856 4839 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="sg-core" containerID="cri-o://c23f136b850cb236ed5c6370a36a398b1c3d6f65a75d6f5afc35d4214b4f0367" gracePeriod=30 Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.362647 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.177:3000/\": EOF" Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.981045 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:45:30 crc kubenswrapper[4839]: I0321 04:45:30.981128 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011206 4839 generic.go:334] "Generic (PLEG): container finished" podID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerID="4474b8a6d0b40b2af1d9b2dc57ca277c0c905892995cd020658c486bec967d7a" exitCode=0 Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011243 4839 generic.go:334] "Generic (PLEG): container finished" podID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerID="c23f136b850cb236ed5c6370a36a398b1c3d6f65a75d6f5afc35d4214b4f0367" exitCode=2 Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011253 4839 generic.go:334] "Generic (PLEG): container finished" podID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" 
containerID="535723ef61f58618feb7059f038c3fae7ab1b8f214f13d762b44e16ad7930481" exitCode=0 Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011262 4839 generic.go:334] "Generic (PLEG): container finished" podID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerID="c93a95cdff62e2e7b6e4859da043de79cf7d681e70275fda634ef213fa4a479c" exitCode=0 Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011286 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerDied","Data":"4474b8a6d0b40b2af1d9b2dc57ca277c0c905892995cd020658c486bec967d7a"} Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011317 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerDied","Data":"c23f136b850cb236ed5c6370a36a398b1c3d6f65a75d6f5afc35d4214b4f0367"} Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011331 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerDied","Data":"535723ef61f58618feb7059f038c3fae7ab1b8f214f13d762b44e16ad7930481"} Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.011345 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerDied","Data":"c93a95cdff62e2e7b6e4859da043de79cf7d681e70275fda634ef213fa4a479c"} Mar 21 04:45:31 crc kubenswrapper[4839]: I0321 04:45:31.488939 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.746646 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.810035 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-sg-core-conf-yaml\") pod \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.810099 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmhp2\" (UniqueName: \"kubernetes.io/projected/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-kube-api-access-vmhp2\") pod \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.810143 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-config-data\") pod \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.810241 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-run-httpd\") pod \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.810336 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-scripts\") pod \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.810419 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-log-httpd\") pod \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.810463 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-combined-ca-bundle\") pod \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\" (UID: \"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9\") " Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.811037 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" (UID: "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.811214 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" (UID: "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.816503 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-scripts" (OuterVolumeSpecName: "scripts") pod "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" (UID: "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.816630 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-kube-api-access-vmhp2" (OuterVolumeSpecName: "kube-api-access-vmhp2") pod "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" (UID: "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9"). InnerVolumeSpecName "kube-api-access-vmhp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.837049 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" (UID: "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.885682 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" (UID: "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.902919 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-config-data" (OuterVolumeSpecName: "config-data") pod "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" (UID: "b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.913082 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.913114 4839 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.913124 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.913137 4839 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.913146 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmhp2\" (UniqueName: \"kubernetes.io/projected/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-kube-api-access-vmhp2\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.913156 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:32 crc kubenswrapper[4839]: I0321 04:45:32.913163 4839 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.028800 4839 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"52b9f7e1-d86c-457e-9391-eee855a9f7a7","Type":"ContainerStarted","Data":"a5dd87302be70fb0a0dce91612650d2cd8feb29133bfdeebdfad1374e6c9d593"} Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.035521 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9","Type":"ContainerDied","Data":"c2ee73cd19c0a119a6c2d0a87e1fbe0c6315a3f99b97a20c1aa91548c3c2c2dc"} Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.035601 4839 scope.go:117] "RemoveContainer" containerID="4474b8a6d0b40b2af1d9b2dc57ca277c0c905892995cd020658c486bec967d7a" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.035600 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.054469 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.608250314 podStartE2EDuration="11.053934919s" podCreationTimestamp="2026-03-21 04:45:22 +0000 UTC" firstStartedPulling="2026-03-21 04:45:23.03368175 +0000 UTC m=+1327.361468426" lastFinishedPulling="2026-03-21 04:45:32.479366365 +0000 UTC m=+1336.807153031" observedRunningTime="2026-03-21 04:45:33.047716245 +0000 UTC m=+1337.375502921" watchObservedRunningTime="2026-03-21 04:45:33.053934919 +0000 UTC m=+1337.381721595" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.067933 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-b66c6bfff-76gfx"] Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.073476 4839 scope.go:117] "RemoveContainer" containerID="c23f136b850cb236ed5c6370a36a398b1c3d6f65a75d6f5afc35d4214b4f0367" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.085736 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:33 crc kubenswrapper[4839]: 
I0321 04:45:33.101558 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.104905 4839 scope.go:117] "RemoveContainer" containerID="535723ef61f58618feb7059f038c3fae7ab1b8f214f13d762b44e16ad7930481" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.113779 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:33 crc kubenswrapper[4839]: E0321 04:45:33.114411 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="proxy-httpd" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.114519 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="proxy-httpd" Mar 21 04:45:33 crc kubenswrapper[4839]: E0321 04:45:33.114558 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-central-agent" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.114582 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-central-agent" Mar 21 04:45:33 crc kubenswrapper[4839]: E0321 04:45:33.114606 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-notification-agent" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.114614 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-notification-agent" Mar 21 04:45:33 crc kubenswrapper[4839]: E0321 04:45:33.114626 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="sg-core" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.114634 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="sg-core" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.114970 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="proxy-httpd" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.115236 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="sg-core" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.115257 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-central-agent" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.115279 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" containerName="ceilometer-notification-agent" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.117519 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.120201 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.120476 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.132121 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.144766 4839 scope.go:117] "RemoveContainer" containerID="c93a95cdff62e2e7b6e4859da043de79cf7d681e70275fda634ef213fa4a479c" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.218741 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksth5\" (UniqueName: \"kubernetes.io/projected/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-kube-api-access-ksth5\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.218784 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.218835 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-scripts\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.218849 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-config-data\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.218899 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-log-httpd\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.218935 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.218997 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-run-httpd\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320104 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320201 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-run-httpd\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " 
pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320237 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksth5\" (UniqueName: \"kubernetes.io/projected/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-kube-api-access-ksth5\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320259 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320302 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-scripts\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320408 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-config-data\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320465 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-log-httpd\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320918 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-log-httpd\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.320995 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-run-httpd\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.324212 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-config-data\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.324588 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.324954 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-scripts\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.324980 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.341035 4839 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ksth5\" (UniqueName: \"kubernetes.io/projected/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-kube-api-access-ksth5\") pod \"ceilometer-0\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.462193 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:33 crc kubenswrapper[4839]: I0321 04:45:33.909938 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:33 crc kubenswrapper[4839]: W0321 04:45:33.915785 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78141ffe_ee2c_4b66_b8e9_7224e526dfa2.slice/crio-826cbd06b784209787d8818bee60d8b520f17d5ca2e06bfab9213c157d0fdf3a WatchSource:0}: Error finding container 826cbd06b784209787d8818bee60d8b520f17d5ca2e06bfab9213c157d0fdf3a: Status 404 returned error can't find the container with id 826cbd06b784209787d8818bee60d8b520f17d5ca2e06bfab9213c157d0fdf3a Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.047156 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerStarted","Data":"826cbd06b784209787d8818bee60d8b520f17d5ca2e06bfab9213c157d0fdf3a"} Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.049302 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b66c6bfff-76gfx" event={"ID":"1af5fd5b-8392-4e55-b3fb-fdc9285dd135","Type":"ContainerStarted","Data":"ee99a1a569d8b7a699f9dad6be9bed8c1d6fcb40d8143ea287a9d977a097eee4"} Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.049334 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b66c6bfff-76gfx" 
event={"ID":"1af5fd5b-8392-4e55-b3fb-fdc9285dd135","Type":"ContainerStarted","Data":"e9bcd08231d23ab63a4055b1e20413d6c3cbe21af71de74483e4551996a1b55f"} Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.049349 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b66c6bfff-76gfx" event={"ID":"1af5fd5b-8392-4e55-b3fb-fdc9285dd135","Type":"ContainerStarted","Data":"b6bd6f4fe36bb8f41cb753bbbf61d47676d982d86187d83427cd05de9a878678"} Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.465036 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9" path="/var/lib/kubelet/pods/b4b205eb-a84a-4c2f-8b49-068d4e0a8ec9/volumes" Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.833604 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-748dbf85fc-jslwv" Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.954916 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d447b4d96-qkb69"] Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.956777 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d447b4d96-qkb69" podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-api" containerID="cri-o://67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90" gracePeriod=30 Mar 21 04:45:34 crc kubenswrapper[4839]: I0321 04:45:34.956970 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d447b4d96-qkb69" podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-httpd" containerID="cri-o://b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc" gracePeriod=30 Mar 21 04:45:35 crc kubenswrapper[4839]: I0321 04:45:35.089699 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:35 crc kubenswrapper[4839]: I0321 
04:45:35.089746 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:35 crc kubenswrapper[4839]: I0321 04:45:35.150417 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-b66c6bfff-76gfx" podStartSLOduration=8.150393858 podStartE2EDuration="8.150393858s" podCreationTimestamp="2026-03-21 04:45:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:35.124989117 +0000 UTC m=+1339.452775813" watchObservedRunningTime="2026-03-21 04:45:35.150393858 +0000 UTC m=+1339.478180534" Mar 21 04:45:36 crc kubenswrapper[4839]: I0321 04:45:36.099592 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerStarted","Data":"b9ba563d5bd44575dcd70c87b15efdd8ce73c0f0033e02dc5b2a64d859412a76"} Mar 21 04:45:36 crc kubenswrapper[4839]: I0321 04:45:36.101780 4839 generic.go:334] "Generic (PLEG): container finished" podID="12b60d89-b044-4822-bc95-47567123e883" containerID="b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc" exitCode=0 Mar 21 04:45:36 crc kubenswrapper[4839]: I0321 04:45:36.101840 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d447b4d96-qkb69" event={"ID":"12b60d89-b044-4822-bc95-47567123e883","Type":"ContainerDied","Data":"b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc"} Mar 21 04:45:36 crc kubenswrapper[4839]: I0321 04:45:36.393410 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:37 crc kubenswrapper[4839]: I0321 04:45:37.113782 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerStarted","Data":"7ad0bd3925864d1f7ff2de141b61807a3f02be0a2c40d1fe27bf1e9b2163e79a"} Mar 21 04:45:37 crc kubenswrapper[4839]: I0321 04:45:37.114096 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerStarted","Data":"0f08d30af95cdb5d8f671e4eec16f22bf5a749117dd6ef17a7712750d567ba4f"} Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.134885 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerStarted","Data":"b3eab21d6ba6a94a0f66e87839a22f19a3c9384b821721458022b1cab04ab92a"} Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.135155 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="ceilometer-central-agent" containerID="cri-o://b9ba563d5bd44575dcd70c87b15efdd8ce73c0f0033e02dc5b2a64d859412a76" gracePeriod=30 Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.135391 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="proxy-httpd" containerID="cri-o://b3eab21d6ba6a94a0f66e87839a22f19a3c9384b821721458022b1cab04ab92a" gracePeriod=30 Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.135467 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="sg-core" containerID="cri-o://7ad0bd3925864d1f7ff2de141b61807a3f02be0a2c40d1fe27bf1e9b2163e79a" gracePeriod=30 Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.135424 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="ceilometer-notification-agent" containerID="cri-o://0f08d30af95cdb5d8f671e4eec16f22bf5a749117dd6ef17a7712750d567ba4f" gracePeriod=30 Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.135620 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.169318 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.382260259 podStartE2EDuration="6.169297628s" podCreationTimestamp="2026-03-21 04:45:33 +0000 UTC" firstStartedPulling="2026-03-21 04:45:33.920056298 +0000 UTC m=+1338.247842974" lastFinishedPulling="2026-03-21 04:45:38.707093667 +0000 UTC m=+1343.034880343" observedRunningTime="2026-03-21 04:45:39.159933186 +0000 UTC m=+1343.487719862" watchObservedRunningTime="2026-03-21 04:45:39.169297628 +0000 UTC m=+1343.497084304" Mar 21 04:45:39 crc kubenswrapper[4839]: I0321 04:45:39.261843 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84c6c985f8-v5cmh" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.065092 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.146636 4839 generic.go:334] "Generic (PLEG): container finished" podID="12b60d89-b044-4822-bc95-47567123e883" containerID="67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90" exitCode=0 Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.146689 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d447b4d96-qkb69" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.146689 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d447b4d96-qkb69" event={"ID":"12b60d89-b044-4822-bc95-47567123e883","Type":"ContainerDied","Data":"67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90"} Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.146748 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d447b4d96-qkb69" event={"ID":"12b60d89-b044-4822-bc95-47567123e883","Type":"ContainerDied","Data":"e44c81ce53fb7cb7cb67615e87aced0fc7bd4c886cbd53ea268fc23a5209a592"} Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.146769 4839 scope.go:117] "RemoveContainer" containerID="b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.151066 4839 generic.go:334] "Generic (PLEG): container finished" podID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerID="b3eab21d6ba6a94a0f66e87839a22f19a3c9384b821721458022b1cab04ab92a" exitCode=0 Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.151099 4839 generic.go:334] "Generic (PLEG): container finished" podID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerID="7ad0bd3925864d1f7ff2de141b61807a3f02be0a2c40d1fe27bf1e9b2163e79a" exitCode=2 Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.151108 4839 generic.go:334] "Generic (PLEG): container finished" podID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerID="0f08d30af95cdb5d8f671e4eec16f22bf5a749117dd6ef17a7712750d567ba4f" exitCode=0 Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.151126 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerDied","Data":"b3eab21d6ba6a94a0f66e87839a22f19a3c9384b821721458022b1cab04ab92a"} Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.151149 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerDied","Data":"7ad0bd3925864d1f7ff2de141b61807a3f02be0a2c40d1fe27bf1e9b2163e79a"} Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.151158 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerDied","Data":"0f08d30af95cdb5d8f671e4eec16f22bf5a749117dd6ef17a7712750d567ba4f"} Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.166791 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swz79\" (UniqueName: \"kubernetes.io/projected/12b60d89-b044-4822-bc95-47567123e883-kube-api-access-swz79\") pod \"12b60d89-b044-4822-bc95-47567123e883\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.166867 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-httpd-config\") pod \"12b60d89-b044-4822-bc95-47567123e883\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.167018 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-config\") pod \"12b60d89-b044-4822-bc95-47567123e883\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.167042 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-ovndb-tls-certs\") pod \"12b60d89-b044-4822-bc95-47567123e883\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.167251 4839 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-combined-ca-bundle\") pod \"12b60d89-b044-4822-bc95-47567123e883\" (UID: \"12b60d89-b044-4822-bc95-47567123e883\") " Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.168914 4839 scope.go:117] "RemoveContainer" containerID="67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.173801 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "12b60d89-b044-4822-bc95-47567123e883" (UID: "12b60d89-b044-4822-bc95-47567123e883"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.174322 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12b60d89-b044-4822-bc95-47567123e883-kube-api-access-swz79" (OuterVolumeSpecName: "kube-api-access-swz79") pod "12b60d89-b044-4822-bc95-47567123e883" (UID: "12b60d89-b044-4822-bc95-47567123e883"). InnerVolumeSpecName "kube-api-access-swz79". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.227589 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-config" (OuterVolumeSpecName: "config") pod "12b60d89-b044-4822-bc95-47567123e883" (UID: "12b60d89-b044-4822-bc95-47567123e883"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.244244 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12b60d89-b044-4822-bc95-47567123e883" (UID: "12b60d89-b044-4822-bc95-47567123e883"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.257859 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "12b60d89-b044-4822-bc95-47567123e883" (UID: "12b60d89-b044-4822-bc95-47567123e883"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.269753 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.269782 4839 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.269796 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.269805 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swz79\" (UniqueName: \"kubernetes.io/projected/12b60d89-b044-4822-bc95-47567123e883-kube-api-access-swz79\") on node \"crc\" 
DevicePath \"\"" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.269814 4839 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/12b60d89-b044-4822-bc95-47567123e883-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.311283 4839 scope.go:117] "RemoveContainer" containerID="b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc" Mar 21 04:45:40 crc kubenswrapper[4839]: E0321 04:45:40.311880 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc\": container with ID starting with b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc not found: ID does not exist" containerID="b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.311929 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc"} err="failed to get container status \"b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc\": rpc error: code = NotFound desc = could not find container \"b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc\": container with ID starting with b34d8d2f70a5b785c9d68f7ee3d5caf1ba929e7c2f1691a3903d1460eb0bcbbc not found: ID does not exist" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.311995 4839 scope.go:117] "RemoveContainer" containerID="67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90" Mar 21 04:45:40 crc kubenswrapper[4839]: E0321 04:45:40.312430 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90\": container with ID starting with 
67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90 not found: ID does not exist" containerID="67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.312462 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90"} err="failed to get container status \"67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90\": rpc error: code = NotFound desc = could not find container \"67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90\": container with ID starting with 67565e75899a6f37a50691495a20ce72ac05f30c786d835eec041d52489fae90 not found: ID does not exist" Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.495415 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d447b4d96-qkb69"] Mar 21 04:45:40 crc kubenswrapper[4839]: I0321 04:45:40.506329 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d447b4d96-qkb69"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.162737 4839 generic.go:334] "Generic (PLEG): container finished" podID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerID="b9ba563d5bd44575dcd70c87b15efdd8ce73c0f0033e02dc5b2a64d859412a76" exitCode=0 Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.162810 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerDied","Data":"b9ba563d5bd44575dcd70c87b15efdd8ce73c0f0033e02dc5b2a64d859412a76"} Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.243184 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.323511 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-log-httpd\") pod \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.323703 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksth5\" (UniqueName: \"kubernetes.io/projected/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-kube-api-access-ksth5\") pod \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.323748 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-combined-ca-bundle\") pod \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.323788 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-run-httpd\") pod \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.323805 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-sg-core-conf-yaml\") pod \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.323840 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-scripts\") pod \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.323911 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-config-data\") pod \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\" (UID: \"78141ffe-ee2c-4b66-b8e9-7224e526dfa2\") " Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.324390 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "78141ffe-ee2c-4b66-b8e9-7224e526dfa2" (UID: "78141ffe-ee2c-4b66-b8e9-7224e526dfa2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.324623 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "78141ffe-ee2c-4b66-b8e9-7224e526dfa2" (UID: "78141ffe-ee2c-4b66-b8e9-7224e526dfa2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.332898 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-kube-api-access-ksth5" (OuterVolumeSpecName: "kube-api-access-ksth5") pod "78141ffe-ee2c-4b66-b8e9-7224e526dfa2" (UID: "78141ffe-ee2c-4b66-b8e9-7224e526dfa2"). InnerVolumeSpecName "kube-api-access-ksth5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.334796 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-scripts" (OuterVolumeSpecName: "scripts") pod "78141ffe-ee2c-4b66-b8e9-7224e526dfa2" (UID: "78141ffe-ee2c-4b66-b8e9-7224e526dfa2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.359757 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "78141ffe-ee2c-4b66-b8e9-7224e526dfa2" (UID: "78141ffe-ee2c-4b66-b8e9-7224e526dfa2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.426284 4839 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.426685 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksth5\" (UniqueName: \"kubernetes.io/projected/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-kube-api-access-ksth5\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.426700 4839 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.426711 4839 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 
04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.426723 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432038 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-4zz89"] Mar 21 04:45:41 crc kubenswrapper[4839]: E0321 04:45:41.432439 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-api" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432457 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-api" Mar 21 04:45:41 crc kubenswrapper[4839]: E0321 04:45:41.432481 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-httpd" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432490 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-httpd" Mar 21 04:45:41 crc kubenswrapper[4839]: E0321 04:45:41.432501 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="ceilometer-central-agent" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432547 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="ceilometer-central-agent" Mar 21 04:45:41 crc kubenswrapper[4839]: E0321 04:45:41.432583 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="ceilometer-notification-agent" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432591 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" 
containerName="ceilometer-notification-agent" Mar 21 04:45:41 crc kubenswrapper[4839]: E0321 04:45:41.432606 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="sg-core" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432615 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="sg-core" Mar 21 04:45:41 crc kubenswrapper[4839]: E0321 04:45:41.432631 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="proxy-httpd" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432638 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="proxy-httpd" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432791 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="sg-core" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432800 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-httpd" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432816 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="proxy-httpd" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432829 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="ceilometer-central-agent" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432843 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" containerName="ceilometer-notification-agent" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.432853 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="12b60d89-b044-4822-bc95-47567123e883" containerName="neutron-api" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.433370 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.449371 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4zz89"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.469830 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-config-data" (OuterVolumeSpecName: "config-data") pod "78141ffe-ee2c-4b66-b8e9-7224e526dfa2" (UID: "78141ffe-ee2c-4b66-b8e9-7224e526dfa2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.470944 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78141ffe-ee2c-4b66-b8e9-7224e526dfa2" (UID: "78141ffe-ee2c-4b66-b8e9-7224e526dfa2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.503914 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-ds7tq"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.504959 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.522768 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-ds7tq"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.527940 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b76e9253-1495-42d5-910f-cce6f2730243-operator-scripts\") pod \"nova-api-db-create-4zz89\" (UID: \"b76e9253-1495-42d5-910f-cce6f2730243\") " pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.528250 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-978bj\" (UniqueName: \"kubernetes.io/projected/b76e9253-1495-42d5-910f-cce6f2730243-kube-api-access-978bj\") pod \"nova-api-db-create-4zz89\" (UID: \"b76e9253-1495-42d5-910f-cce6f2730243\") " pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.529160 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.529667 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78141ffe-ee2c-4b66-b8e9-7224e526dfa2-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.617797 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-w9wx6"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.619290 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.630597 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-48d5-account-create-update-5k79b"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.631828 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.634374 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b76e9253-1495-42d5-910f-cce6f2730243-operator-scripts\") pod \"nova-api-db-create-4zz89\" (UID: \"b76e9253-1495-42d5-910f-cce6f2730243\") " pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.634465 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-operator-scripts\") pod \"nova-cell0-db-create-ds7tq\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.634515 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb2gw\" (UniqueName: \"kubernetes.io/projected/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-kube-api-access-mb2gw\") pod \"nova-cell0-db-create-ds7tq\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.634606 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-978bj\" (UniqueName: \"kubernetes.io/projected/b76e9253-1495-42d5-910f-cce6f2730243-kube-api-access-978bj\") pod \"nova-api-db-create-4zz89\" (UID: 
\"b76e9253-1495-42d5-910f-cce6f2730243\") " pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.635183 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b76e9253-1495-42d5-910f-cce6f2730243-operator-scripts\") pod \"nova-api-db-create-4zz89\" (UID: \"b76e9253-1495-42d5-910f-cce6f2730243\") " pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.636328 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.645994 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w9wx6"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.663099 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-978bj\" (UniqueName: \"kubernetes.io/projected/b76e9253-1495-42d5-910f-cce6f2730243-kube-api-access-978bj\") pod \"nova-api-db-create-4zz89\" (UID: \"b76e9253-1495-42d5-910f-cce6f2730243\") " pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.674472 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-48d5-account-create-update-5k79b"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.736646 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c56098-2959-4bd0-b762-36a4ee1bb2e6-operator-scripts\") pod \"nova-cell1-db-create-w9wx6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.736727 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpg9b\" (UniqueName: 
\"kubernetes.io/projected/f481fb0d-ac2f-4989-a547-50f5081e4e78-kube-api-access-gpg9b\") pod \"nova-api-48d5-account-create-update-5k79b\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.736833 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-operator-scripts\") pod \"nova-cell0-db-create-ds7tq\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.736880 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb2gw\" (UniqueName: \"kubernetes.io/projected/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-kube-api-access-mb2gw\") pod \"nova-cell0-db-create-ds7tq\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.736910 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhvhx\" (UniqueName: \"kubernetes.io/projected/46c56098-2959-4bd0-b762-36a4ee1bb2e6-kube-api-access-bhvhx\") pod \"nova-cell1-db-create-w9wx6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.736937 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f481fb0d-ac2f-4989-a547-50f5081e4e78-operator-scripts\") pod \"nova-api-48d5-account-create-update-5k79b\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.737779 4839 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-operator-scripts\") pod \"nova-cell0-db-create-ds7tq\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.750937 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.759641 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb2gw\" (UniqueName: \"kubernetes.io/projected/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-kube-api-access-mb2gw\") pod \"nova-cell0-db-create-ds7tq\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.827146 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.829356 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-46c8-account-create-update-mp8jl"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.830705 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.833379 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.838405 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c56098-2959-4bd0-b762-36a4ee1bb2e6-operator-scripts\") pod \"nova-cell1-db-create-w9wx6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.838472 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpg9b\" (UniqueName: \"kubernetes.io/projected/f481fb0d-ac2f-4989-a547-50f5081e4e78-kube-api-access-gpg9b\") pod \"nova-api-48d5-account-create-update-5k79b\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.838589 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhvhx\" (UniqueName: \"kubernetes.io/projected/46c56098-2959-4bd0-b762-36a4ee1bb2e6-kube-api-access-bhvhx\") pod \"nova-cell1-db-create-w9wx6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.838612 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f481fb0d-ac2f-4989-a547-50f5081e4e78-operator-scripts\") pod \"nova-api-48d5-account-create-update-5k79b\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.839351 4839 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f481fb0d-ac2f-4989-a547-50f5081e4e78-operator-scripts\") pod \"nova-api-48d5-account-create-update-5k79b\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.842583 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c56098-2959-4bd0-b762-36a4ee1bb2e6-operator-scripts\") pod \"nova-cell1-db-create-w9wx6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.857674 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-46c8-account-create-update-mp8jl"] Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.864735 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhvhx\" (UniqueName: \"kubernetes.io/projected/46c56098-2959-4bd0-b762-36a4ee1bb2e6-kube-api-access-bhvhx\") pod \"nova-cell1-db-create-w9wx6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.865171 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpg9b\" (UniqueName: \"kubernetes.io/projected/f481fb0d-ac2f-4989-a547-50f5081e4e78-kube-api-access-gpg9b\") pod \"nova-api-48d5-account-create-update-5k79b\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.933840 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.940510 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4185a56e-9d10-4aea-ad84-a865dff3e6be-operator-scripts\") pod \"nova-cell0-46c8-account-create-update-mp8jl\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.940802 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb7zf\" (UniqueName: \"kubernetes.io/projected/4185a56e-9d10-4aea-ad84-a865dff3e6be-kube-api-access-vb7zf\") pod \"nova-cell0-46c8-account-create-update-mp8jl\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:41 crc kubenswrapper[4839]: I0321 04:45:41.945721 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.041622 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-94b7-account-create-update-zmpzr"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.042807 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb7zf\" (UniqueName: \"kubernetes.io/projected/4185a56e-9d10-4aea-ad84-a865dff3e6be-kube-api-access-vb7zf\") pod \"nova-cell0-46c8-account-create-update-mp8jl\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.042855 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4185a56e-9d10-4aea-ad84-a865dff3e6be-operator-scripts\") pod \"nova-cell0-46c8-account-create-update-mp8jl\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.043062 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.045085 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4185a56e-9d10-4aea-ad84-a865dff3e6be-operator-scripts\") pod \"nova-cell0-46c8-account-create-update-mp8jl\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.046105 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.058667 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-94b7-account-create-update-zmpzr"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.064906 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb7zf\" (UniqueName: \"kubernetes.io/projected/4185a56e-9d10-4aea-ad84-a865dff3e6be-kube-api-access-vb7zf\") pod \"nova-cell0-46c8-account-create-update-mp8jl\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.144988 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60534a44-1538-4bdb-81d1-043c9ae84cee-operator-scripts\") pod \"nova-cell1-94b7-account-create-update-zmpzr\" (UID: \"60534a44-1538-4bdb-81d1-043c9ae84cee\") " pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.145293 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j98k2\" (UniqueName: 
\"kubernetes.io/projected/60534a44-1538-4bdb-81d1-043c9ae84cee-kube-api-access-j98k2\") pod \"nova-cell1-94b7-account-create-update-zmpzr\" (UID: \"60534a44-1538-4bdb-81d1-043c9ae84cee\") " pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.186338 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"78141ffe-ee2c-4b66-b8e9-7224e526dfa2","Type":"ContainerDied","Data":"826cbd06b784209787d8818bee60d8b520f17d5ca2e06bfab9213c157d0fdf3a"} Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.186400 4839 scope.go:117] "RemoveContainer" containerID="b3eab21d6ba6a94a0f66e87839a22f19a3c9384b821721458022b1cab04ab92a" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.186593 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.232959 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.241007 4839 scope.go:117] "RemoveContainer" containerID="7ad0bd3925864d1f7ff2de141b61807a3f02be0a2c40d1fe27bf1e9b2163e79a" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.246892 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60534a44-1538-4bdb-81d1-043c9ae84cee-operator-scripts\") pod \"nova-cell1-94b7-account-create-update-zmpzr\" (UID: \"60534a44-1538-4bdb-81d1-043c9ae84cee\") " pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.246963 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j98k2\" (UniqueName: \"kubernetes.io/projected/60534a44-1538-4bdb-81d1-043c9ae84cee-kube-api-access-j98k2\") pod \"nova-cell1-94b7-account-create-update-zmpzr\" (UID: 
\"60534a44-1538-4bdb-81d1-043c9ae84cee\") " pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.250930 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60534a44-1538-4bdb-81d1-043c9ae84cee-operator-scripts\") pod \"nova-cell1-94b7-account-create-update-zmpzr\" (UID: \"60534a44-1538-4bdb-81d1-043c9ae84cee\") " pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: W0321 04:45:42.252103 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb76e9253_1495_42d5_910f_cce6f2730243.slice/crio-54b14184b0a30e6f28cd2e9d592a640dccc616fa6f788aae4c5dcf3a458c8feb WatchSource:0}: Error finding container 54b14184b0a30e6f28cd2e9d592a640dccc616fa6f788aae4c5dcf3a458c8feb: Status 404 returned error can't find the container with id 54b14184b0a30e6f28cd2e9d592a640dccc616fa6f788aae4c5dcf3a458c8feb Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.261992 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.266353 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j98k2\" (UniqueName: \"kubernetes.io/projected/60534a44-1538-4bdb-81d1-043c9ae84cee-kube-api-access-j98k2\") pod \"nova-cell1-94b7-account-create-update-zmpzr\" (UID: \"60534a44-1538-4bdb-81d1-043c9ae84cee\") " pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.283850 4839 scope.go:117] "RemoveContainer" containerID="0f08d30af95cdb5d8f671e4eec16f22bf5a749117dd6ef17a7712750d567ba4f" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.284034 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4zz89"] Mar 21 04:45:42 crc 
kubenswrapper[4839]: I0321 04:45:42.295828 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.298758 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.300963 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.301226 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.307138 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.307753 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.326554 4839 scope.go:117] "RemoveContainer" containerID="b9ba563d5bd44575dcd70c87b15efdd8ce73c0f0033e02dc5b2a64d859412a76" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.348451 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-run-httpd\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.348543 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldtbn\" (UniqueName: \"kubernetes.io/projected/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-kube-api-access-ldtbn\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.348631 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.348666 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-log-httpd\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.348698 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-scripts\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.349014 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-config-data\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.349169 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.382817 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.451430 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-scripts\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.451736 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-config-data\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.451837 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.451951 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-run-httpd\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.452065 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldtbn\" (UniqueName: \"kubernetes.io/projected/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-kube-api-access-ldtbn\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.452690 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.452788 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-log-httpd\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.452556 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-run-httpd\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.453232 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-log-httpd\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.458593 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-scripts\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.460726 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-config-data\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.461236 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.463629 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.474061 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12b60d89-b044-4822-bc95-47567123e883" path="/var/lib/kubelet/pods/12b60d89-b044-4822-bc95-47567123e883/volumes" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.474887 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78141ffe-ee2c-4b66-b8e9-7224e526dfa2" path="/var/lib/kubelet/pods/78141ffe-ee2c-4b66-b8e9-7224e526dfa2/volumes" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.477783 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldtbn\" (UniqueName: \"kubernetes.io/projected/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-kube-api-access-ldtbn\") pod \"ceilometer-0\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.487724 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-48d5-account-create-update-5k79b"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.510458 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w9wx6"] Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.523315 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-db-create-ds7tq"] Mar 21 04:45:42 crc kubenswrapper[4839]: W0321 04:45:42.554912 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46c56098_2959_4bd0_b762_36a4ee1bb2e6.slice/crio-4cd877ec810dc6f7d6a39c46cd7ecf7300f180e282dae1509e0c792ab4b45fc8 WatchSource:0}: Error finding container 4cd877ec810dc6f7d6a39c46cd7ecf7300f180e282dae1509e0c792ab4b45fc8: Status 404 returned error can't find the container with id 4cd877ec810dc6f7d6a39c46cd7ecf7300f180e282dae1509e0c792ab4b45fc8 Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.628254 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.803764 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.826477 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-b66c6bfff-76gfx" Mar 21 04:45:42 crc kubenswrapper[4839]: I0321 04:45:42.923754 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-94b7-account-create-update-zmpzr"] Mar 21 04:45:42 crc kubenswrapper[4839]: W0321 04:45:42.927658 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60534a44_1538_4bdb_81d1_043c9ae84cee.slice/crio-df524f3b5015131b55e090a47dcfb3d8225d4911cc5b551f8673ad913f2f5471 WatchSource:0}: Error finding container df524f3b5015131b55e090a47dcfb3d8225d4911cc5b551f8673ad913f2f5471: Status 404 returned error can't find the container with id df524f3b5015131b55e090a47dcfb3d8225d4911cc5b551f8673ad913f2f5471 Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.000930 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-46c8-account-create-update-mp8jl"] Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.228832 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9wx6" event={"ID":"46c56098-2959-4bd0-b762-36a4ee1bb2e6","Type":"ContainerStarted","Data":"ccd22af7723d538ca33a42ba3654ebdb55e8713c02134e6ab93cc893ad28c76a"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.228884 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9wx6" event={"ID":"46c56098-2959-4bd0-b762-36a4ee1bb2e6","Type":"ContainerStarted","Data":"4cd877ec810dc6f7d6a39c46cd7ecf7300f180e282dae1509e0c792ab4b45fc8"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.234131 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4zz89" event={"ID":"b76e9253-1495-42d5-910f-cce6f2730243","Type":"ContainerStarted","Data":"d3d91629ebc8060afc821dc6f6ff1f1f4f9eb9613514c223b3a39c31ccd40e5c"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.234364 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4zz89" event={"ID":"b76e9253-1495-42d5-910f-cce6f2730243","Type":"ContainerStarted","Data":"54b14184b0a30e6f28cd2e9d592a640dccc616fa6f788aae4c5dcf3a458c8feb"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.238467 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.244500 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" event={"ID":"60534a44-1538-4bdb-81d1-043c9ae84cee","Type":"ContainerStarted","Data":"df524f3b5015131b55e090a47dcfb3d8225d4911cc5b551f8673ad913f2f5471"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.255220 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" 
event={"ID":"4185a56e-9d10-4aea-ad84-a865dff3e6be","Type":"ContainerStarted","Data":"5066435ed3d77bc5c33c59d562874afad187c31dc999c5e5a391a142f1d66cb0"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.261765 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-48d5-account-create-update-5k79b" event={"ID":"f481fb0d-ac2f-4989-a547-50f5081e4e78","Type":"ContainerStarted","Data":"9c0964d074027bd8b20f7561440904d50f0f5ad70eed7435ed8da532c09da947"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.261919 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-48d5-account-create-update-5k79b" event={"ID":"f481fb0d-ac2f-4989-a547-50f5081e4e78","Type":"ContainerStarted","Data":"f92cef9a4e1b4a3b36fa3f0703a08a139186e14f9ea165ca6acea88ecdb50732"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.279005 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-4zz89" podStartSLOduration=2.278970437 podStartE2EDuration="2.278970437s" podCreationTimestamp="2026-03-21 04:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:43.254316157 +0000 UTC m=+1347.582102843" watchObservedRunningTime="2026-03-21 04:45:43.278970437 +0000 UTC m=+1347.606757113" Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.300869 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-48d5-account-create-update-5k79b" podStartSLOduration=2.300843349 podStartE2EDuration="2.300843349s" podCreationTimestamp="2026-03-21 04:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:43.295057327 +0000 UTC m=+1347.622844003" watchObservedRunningTime="2026-03-21 04:45:43.300843349 +0000 UTC m=+1347.628630025" Mar 21 04:45:43 crc kubenswrapper[4839]: 
I0321 04:45:43.300988 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ds7tq" event={"ID":"9220ed3c-2e97-4efc-a4cc-28bb29774ad8","Type":"ContainerStarted","Data":"296c6956b7a45c772d2bc75858a9b2db91782289c6cf30854b24fd106bb5d692"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.301049 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ds7tq" event={"ID":"9220ed3c-2e97-4efc-a4cc-28bb29774ad8","Type":"ContainerStarted","Data":"69b1f529d6acbca50b055efd49164192e5afe15ca2555525c0367f380a6d5b3e"} Mar 21 04:45:43 crc kubenswrapper[4839]: I0321 04:45:43.319734 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-ds7tq" podStartSLOduration=2.319717137 podStartE2EDuration="2.319717137s" podCreationTimestamp="2026-03-21 04:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:43.314205533 +0000 UTC m=+1347.641992209" watchObservedRunningTime="2026-03-21 04:45:43.319717137 +0000 UTC m=+1347.647503813" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.329174 4839 generic.go:334] "Generic (PLEG): container finished" podID="b76e9253-1495-42d5-910f-cce6f2730243" containerID="d3d91629ebc8060afc821dc6f6ff1f1f4f9eb9613514c223b3a39c31ccd40e5c" exitCode=0 Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.329486 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4zz89" event={"ID":"b76e9253-1495-42d5-910f-cce6f2730243","Type":"ContainerDied","Data":"d3d91629ebc8060afc821dc6f6ff1f1f4f9eb9613514c223b3a39c31ccd40e5c"} Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.333868 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerStarted","Data":"41ff38380ac8ed55675761ad2bd4b24ee85da709e085d038d76ac53207f2c9ae"} Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.353047 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" event={"ID":"60534a44-1538-4bdb-81d1-043c9ae84cee","Type":"ContainerStarted","Data":"2de908b5bd6bba55215cf326e7323c0123b89a96311bd62e86b355ee0ff19bc1"} Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.360078 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" event={"ID":"4185a56e-9d10-4aea-ad84-a865dff3e6be","Type":"ContainerStarted","Data":"c7f784ce54bb50fe64fb506149fb81059511360e35c18d47126e20bcbe758d00"} Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.368389 4839 generic.go:334] "Generic (PLEG): container finished" podID="f481fb0d-ac2f-4989-a547-50f5081e4e78" containerID="9c0964d074027bd8b20f7561440904d50f0f5ad70eed7435ed8da532c09da947" exitCode=0 Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.368783 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-48d5-account-create-update-5k79b" event={"ID":"f481fb0d-ac2f-4989-a547-50f5081e4e78","Type":"ContainerDied","Data":"9c0964d074027bd8b20f7561440904d50f0f5ad70eed7435ed8da532c09da947"} Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.371839 4839 generic.go:334] "Generic (PLEG): container finished" podID="9220ed3c-2e97-4efc-a4cc-28bb29774ad8" containerID="296c6956b7a45c772d2bc75858a9b2db91782289c6cf30854b24fd106bb5d692" exitCode=0 Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.371890 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ds7tq" event={"ID":"9220ed3c-2e97-4efc-a4cc-28bb29774ad8","Type":"ContainerDied","Data":"296c6956b7a45c772d2bc75858a9b2db91782289c6cf30854b24fd106bb5d692"} Mar 21 04:45:44 crc kubenswrapper[4839]: 
I0321 04:45:44.377209 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" podStartSLOduration=3.37719068 podStartE2EDuration="3.37719068s" podCreationTimestamp="2026-03-21 04:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:44.371446669 +0000 UTC m=+1348.699233345" watchObservedRunningTime="2026-03-21 04:45:44.37719068 +0000 UTC m=+1348.704977356" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.385607 4839 generic.go:334] "Generic (PLEG): container finished" podID="46c56098-2959-4bd0-b762-36a4ee1bb2e6" containerID="ccd22af7723d538ca33a42ba3654ebdb55e8713c02134e6ab93cc893ad28c76a" exitCode=0 Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.385706 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9wx6" event={"ID":"46c56098-2959-4bd0-b762-36a4ee1bb2e6","Type":"ContainerDied","Data":"ccd22af7723d538ca33a42ba3654ebdb55e8713c02134e6ab93cc893ad28c76a"} Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.393207 4839 generic.go:334] "Generic (PLEG): container finished" podID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerID="0bc7ef10848b0da5e68b6c3552cc343013046d2176bf665b0d2389f263149510" exitCode=137 Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.393301 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c6c985f8-v5cmh" event={"ID":"b3b26c3a-55d5-442a-9c31-187b0aa60f90","Type":"ContainerDied","Data":"0bc7ef10848b0da5e68b6c3552cc343013046d2176bf665b0d2389f263149510"} Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.411837 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" podStartSLOduration=3.411817518 podStartE2EDuration="3.411817518s" podCreationTimestamp="2026-03-21 04:45:41 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:44.386998814 +0000 UTC m=+1348.714785510" watchObservedRunningTime="2026-03-21 04:45:44.411817518 +0000 UTC m=+1348.739604194" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.513688 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.678795 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84c6c985f8-v5cmh" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.715712 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-scripts\") pod \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.716605 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b26c3a-55d5-442a-9c31-187b0aa60f90-logs\") pod \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.716729 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-secret-key\") pod \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.716800 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmrcp\" (UniqueName: \"kubernetes.io/projected/b3b26c3a-55d5-442a-9c31-187b0aa60f90-kube-api-access-vmrcp\") pod \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\" (UID: 
\"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.716839 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-tls-certs\") pod \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.716976 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-config-data\") pod \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.717026 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-combined-ca-bundle\") pod \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\" (UID: \"b3b26c3a-55d5-442a-9c31-187b0aa60f90\") " Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.722234 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3b26c3a-55d5-442a-9c31-187b0aa60f90-kube-api-access-vmrcp" (OuterVolumeSpecName: "kube-api-access-vmrcp") pod "b3b26c3a-55d5-442a-9c31-187b0aa60f90" (UID: "b3b26c3a-55d5-442a-9c31-187b0aa60f90"). InnerVolumeSpecName "kube-api-access-vmrcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.722485 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3b26c3a-55d5-442a-9c31-187b0aa60f90-logs" (OuterVolumeSpecName: "logs") pod "b3b26c3a-55d5-442a-9c31-187b0aa60f90" (UID: "b3b26c3a-55d5-442a-9c31-187b0aa60f90"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.725254 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b3b26c3a-55d5-442a-9c31-187b0aa60f90" (UID: "b3b26c3a-55d5-442a-9c31-187b0aa60f90"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.748057 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-config-data" (OuterVolumeSpecName: "config-data") pod "b3b26c3a-55d5-442a-9c31-187b0aa60f90" (UID: "b3b26c3a-55d5-442a-9c31-187b0aa60f90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.749198 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-scripts" (OuterVolumeSpecName: "scripts") pod "b3b26c3a-55d5-442a-9c31-187b0aa60f90" (UID: "b3b26c3a-55d5-442a-9c31-187b0aa60f90"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.749941 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3b26c3a-55d5-442a-9c31-187b0aa60f90" (UID: "b3b26c3a-55d5-442a-9c31-187b0aa60f90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.790128 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "b3b26c3a-55d5-442a-9c31-187b0aa60f90" (UID: "b3b26c3a-55d5-442a-9c31-187b0aa60f90"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.821283 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3b26c3a-55d5-442a-9c31-187b0aa60f90-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.821345 4839 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.821361 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmrcp\" (UniqueName: \"kubernetes.io/projected/b3b26c3a-55d5-442a-9c31-187b0aa60f90-kube-api-access-vmrcp\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.821374 4839 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.821386 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.821398 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b3b26c3a-55d5-442a-9c31-187b0aa60f90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:44 crc kubenswrapper[4839]: I0321 04:45:44.821412 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3b26c3a-55d5-442a-9c31-187b0aa60f90-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.412152 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84c6c985f8-v5cmh" event={"ID":"b3b26c3a-55d5-442a-9c31-187b0aa60f90","Type":"ContainerDied","Data":"b5753b189f3fee68b09fb93ec56788b978b6a6741d48ecf04c45ca76fee101e1"} Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.412176 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84c6c985f8-v5cmh" Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.413383 4839 scope.go:117] "RemoveContainer" containerID="e004b9646c4df34c1d5bba67912a6fa76f3cccc25c7980ab777e369e37ce16c9" Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.418871 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerStarted","Data":"fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712"} Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.418911 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerStarted","Data":"815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e"} Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.421354 4839 generic.go:334] "Generic (PLEG): container finished" podID="60534a44-1538-4bdb-81d1-043c9ae84cee" containerID="2de908b5bd6bba55215cf326e7323c0123b89a96311bd62e86b355ee0ff19bc1" exitCode=0 Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.421466 4839 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" event={"ID":"60534a44-1538-4bdb-81d1-043c9ae84cee","Type":"ContainerDied","Data":"2de908b5bd6bba55215cf326e7323c0123b89a96311bd62e86b355ee0ff19bc1"} Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.423119 4839 generic.go:334] "Generic (PLEG): container finished" podID="4185a56e-9d10-4aea-ad84-a865dff3e6be" containerID="c7f784ce54bb50fe64fb506149fb81059511360e35c18d47126e20bcbe758d00" exitCode=0 Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.423285 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" event={"ID":"4185a56e-9d10-4aea-ad84-a865dff3e6be","Type":"ContainerDied","Data":"c7f784ce54bb50fe64fb506149fb81059511360e35c18d47126e20bcbe758d00"} Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.483995 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84c6c985f8-v5cmh"] Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.504956 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-84c6c985f8-v5cmh"] Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.632752 4839 scope.go:117] "RemoveContainer" containerID="0bc7ef10848b0da5e68b6c3552cc343013046d2176bf665b0d2389f263149510" Mar 21 04:45:45 crc kubenswrapper[4839]: I0321 04:45:45.970959 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.047339 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpg9b\" (UniqueName: \"kubernetes.io/projected/f481fb0d-ac2f-4989-a547-50f5081e4e78-kube-api-access-gpg9b\") pod \"f481fb0d-ac2f-4989-a547-50f5081e4e78\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.047439 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f481fb0d-ac2f-4989-a547-50f5081e4e78-operator-scripts\") pod \"f481fb0d-ac2f-4989-a547-50f5081e4e78\" (UID: \"f481fb0d-ac2f-4989-a547-50f5081e4e78\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.052823 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f481fb0d-ac2f-4989-a547-50f5081e4e78-kube-api-access-gpg9b" (OuterVolumeSpecName: "kube-api-access-gpg9b") pod "f481fb0d-ac2f-4989-a547-50f5081e4e78" (UID: "f481fb0d-ac2f-4989-a547-50f5081e4e78"). InnerVolumeSpecName "kube-api-access-gpg9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.069179 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f481fb0d-ac2f-4989-a547-50f5081e4e78-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f481fb0d-ac2f-4989-a547-50f5081e4e78" (UID: "f481fb0d-ac2f-4989-a547-50f5081e4e78"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.149033 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpg9b\" (UniqueName: \"kubernetes.io/projected/f481fb0d-ac2f-4989-a547-50f5081e4e78-kube-api-access-gpg9b\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.149066 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f481fb0d-ac2f-4989-a547-50f5081e4e78-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.185513 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.206347 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.210763 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.250177 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-operator-scripts\") pod \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.250328 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c56098-2959-4bd0-b762-36a4ee1bb2e6-operator-scripts\") pod \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.250374 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb2gw\" (UniqueName: \"kubernetes.io/projected/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-kube-api-access-mb2gw\") pod \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\" (UID: \"9220ed3c-2e97-4efc-a4cc-28bb29774ad8\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.250422 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-978bj\" (UniqueName: \"kubernetes.io/projected/b76e9253-1495-42d5-910f-cce6f2730243-kube-api-access-978bj\") pod \"b76e9253-1495-42d5-910f-cce6f2730243\" (UID: \"b76e9253-1495-42d5-910f-cce6f2730243\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.250486 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhvhx\" (UniqueName: \"kubernetes.io/projected/46c56098-2959-4bd0-b762-36a4ee1bb2e6-kube-api-access-bhvhx\") pod \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\" (UID: \"46c56098-2959-4bd0-b762-36a4ee1bb2e6\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.250515 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b76e9253-1495-42d5-910f-cce6f2730243-operator-scripts\") pod \"b76e9253-1495-42d5-910f-cce6f2730243\" (UID: \"b76e9253-1495-42d5-910f-cce6f2730243\") " Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.250642 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9220ed3c-2e97-4efc-a4cc-28bb29774ad8" (UID: "9220ed3c-2e97-4efc-a4cc-28bb29774ad8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.251049 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c56098-2959-4bd0-b762-36a4ee1bb2e6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46c56098-2959-4bd0-b762-36a4ee1bb2e6" (UID: "46c56098-2959-4bd0-b762-36a4ee1bb2e6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.251417 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.251428 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b76e9253-1495-42d5-910f-cce6f2730243-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b76e9253-1495-42d5-910f-cce6f2730243" (UID: "b76e9253-1495-42d5-910f-cce6f2730243"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.251439 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c56098-2959-4bd0-b762-36a4ee1bb2e6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.257401 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c56098-2959-4bd0-b762-36a4ee1bb2e6-kube-api-access-bhvhx" (OuterVolumeSpecName: "kube-api-access-bhvhx") pod "46c56098-2959-4bd0-b762-36a4ee1bb2e6" (UID: "46c56098-2959-4bd0-b762-36a4ee1bb2e6"). InnerVolumeSpecName "kube-api-access-bhvhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.273822 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-kube-api-access-mb2gw" (OuterVolumeSpecName: "kube-api-access-mb2gw") pod "9220ed3c-2e97-4efc-a4cc-28bb29774ad8" (UID: "9220ed3c-2e97-4efc-a4cc-28bb29774ad8"). InnerVolumeSpecName "kube-api-access-mb2gw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.273939 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76e9253-1495-42d5-910f-cce6f2730243-kube-api-access-978bj" (OuterVolumeSpecName: "kube-api-access-978bj") pod "b76e9253-1495-42d5-910f-cce6f2730243" (UID: "b76e9253-1495-42d5-910f-cce6f2730243"). InnerVolumeSpecName "kube-api-access-978bj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.353439 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb2gw\" (UniqueName: \"kubernetes.io/projected/9220ed3c-2e97-4efc-a4cc-28bb29774ad8-kube-api-access-mb2gw\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.353487 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhvhx\" (UniqueName: \"kubernetes.io/projected/46c56098-2959-4bd0-b762-36a4ee1bb2e6-kube-api-access-bhvhx\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.353501 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-978bj\" (UniqueName: \"kubernetes.io/projected/b76e9253-1495-42d5-910f-cce6f2730243-kube-api-access-978bj\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.353519 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b76e9253-1495-42d5-910f-cce6f2730243-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.433807 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-ds7tq" event={"ID":"9220ed3c-2e97-4efc-a4cc-28bb29774ad8","Type":"ContainerDied","Data":"69b1f529d6acbca50b055efd49164192e5afe15ca2555525c0367f380a6d5b3e"} Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.433864 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69b1f529d6acbca50b055efd49164192e5afe15ca2555525c0367f380a6d5b3e" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.433863 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-ds7tq" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.435397 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w9wx6" event={"ID":"46c56098-2959-4bd0-b762-36a4ee1bb2e6","Type":"ContainerDied","Data":"4cd877ec810dc6f7d6a39c46cd7ecf7300f180e282dae1509e0c792ab4b45fc8"} Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.435451 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cd877ec810dc6f7d6a39c46cd7ecf7300f180e282dae1509e0c792ab4b45fc8" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.435408 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w9wx6" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.438332 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4zz89" event={"ID":"b76e9253-1495-42d5-910f-cce6f2730243","Type":"ContainerDied","Data":"54b14184b0a30e6f28cd2e9d592a640dccc616fa6f788aae4c5dcf3a458c8feb"} Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.438358 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54b14184b0a30e6f28cd2e9d592a640dccc616fa6f788aae4c5dcf3a458c8feb" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.438371 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4zz89" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.440560 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerStarted","Data":"5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be"} Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.442144 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-48d5-account-create-update-5k79b" event={"ID":"f481fb0d-ac2f-4989-a547-50f5081e4e78","Type":"ContainerDied","Data":"f92cef9a4e1b4a3b36fa3f0703a08a139186e14f9ea165ca6acea88ecdb50732"} Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.442187 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f92cef9a4e1b4a3b36fa3f0703a08a139186e14f9ea165ca6acea88ecdb50732" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.442210 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-48d5-account-create-update-5k79b" Mar 21 04:45:46 crc kubenswrapper[4839]: I0321 04:45:46.477770 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" path="/var/lib/kubelet/pods/b3b26c3a-55d5-442a-9c31-187b0aa60f90/volumes" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.048166 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.055677 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.067466 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4185a56e-9d10-4aea-ad84-a865dff3e6be-operator-scripts\") pod \"4185a56e-9d10-4aea-ad84-a865dff3e6be\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.067536 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb7zf\" (UniqueName: \"kubernetes.io/projected/4185a56e-9d10-4aea-ad84-a865dff3e6be-kube-api-access-vb7zf\") pod \"4185a56e-9d10-4aea-ad84-a865dff3e6be\" (UID: \"4185a56e-9d10-4aea-ad84-a865dff3e6be\") " Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.067645 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j98k2\" (UniqueName: \"kubernetes.io/projected/60534a44-1538-4bdb-81d1-043c9ae84cee-kube-api-access-j98k2\") pod \"60534a44-1538-4bdb-81d1-043c9ae84cee\" (UID: \"60534a44-1538-4bdb-81d1-043c9ae84cee\") " Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.067785 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60534a44-1538-4bdb-81d1-043c9ae84cee-operator-scripts\") pod \"60534a44-1538-4bdb-81d1-043c9ae84cee\" (UID: \"60534a44-1538-4bdb-81d1-043c9ae84cee\") " Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.068277 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4185a56e-9d10-4aea-ad84-a865dff3e6be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4185a56e-9d10-4aea-ad84-a865dff3e6be" (UID: "4185a56e-9d10-4aea-ad84-a865dff3e6be"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.069074 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60534a44-1538-4bdb-81d1-043c9ae84cee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60534a44-1538-4bdb-81d1-043c9ae84cee" (UID: "60534a44-1538-4bdb-81d1-043c9ae84cee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.107034 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4185a56e-9d10-4aea-ad84-a865dff3e6be-kube-api-access-vb7zf" (OuterVolumeSpecName: "kube-api-access-vb7zf") pod "4185a56e-9d10-4aea-ad84-a865dff3e6be" (UID: "4185a56e-9d10-4aea-ad84-a865dff3e6be"). InnerVolumeSpecName "kube-api-access-vb7zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.107201 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60534a44-1538-4bdb-81d1-043c9ae84cee-kube-api-access-j98k2" (OuterVolumeSpecName: "kube-api-access-j98k2") pod "60534a44-1538-4bdb-81d1-043c9ae84cee" (UID: "60534a44-1538-4bdb-81d1-043c9ae84cee"). InnerVolumeSpecName "kube-api-access-j98k2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.170225 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60534a44-1538-4bdb-81d1-043c9ae84cee-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.170256 4839 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4185a56e-9d10-4aea-ad84-a865dff3e6be-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.170268 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb7zf\" (UniqueName: \"kubernetes.io/projected/4185a56e-9d10-4aea-ad84-a865dff3e6be-kube-api-access-vb7zf\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.170279 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j98k2\" (UniqueName: \"kubernetes.io/projected/60534a44-1538-4bdb-81d1-043c9ae84cee-kube-api-access-j98k2\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.286131 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.376732 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75bd8b89b4-djjlh" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.481220 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.481415 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-94b7-account-create-update-zmpzr" event={"ID":"60534a44-1538-4bdb-81d1-043c9ae84cee","Type":"ContainerDied","Data":"df524f3b5015131b55e090a47dcfb3d8225d4911cc5b551f8673ad913f2f5471"} Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.482490 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df524f3b5015131b55e090a47dcfb3d8225d4911cc5b551f8673ad913f2f5471" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.482516 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5788c8f798-khqlb"] Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.482769 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5788c8f798-khqlb" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-log" containerID="cri-o://50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345" gracePeriod=30 Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.483178 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5788c8f798-khqlb" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-api" containerID="cri-o://ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5" gracePeriod=30 Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.498147 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.498324 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-46c8-account-create-update-mp8jl" event={"ID":"4185a56e-9d10-4aea-ad84-a865dff3e6be","Type":"ContainerDied","Data":"5066435ed3d77bc5c33c59d562874afad187c31dc999c5e5a391a142f1d66cb0"} Mar 21 04:45:47 crc kubenswrapper[4839]: I0321 04:45:47.498439 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5066435ed3d77bc5c33c59d562874afad187c31dc999c5e5a391a142f1d66cb0" Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.396918 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.397722 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-log" containerID="cri-o://7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc" gracePeriod=30 Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.397851 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-httpd" containerID="cri-o://f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34" gracePeriod=30 Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.514078 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerStarted","Data":"5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717"} Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.514275 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-central-agent" containerID="cri-o://815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e" gracePeriod=30 Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.514629 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.514969 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="proxy-httpd" containerID="cri-o://5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717" gracePeriod=30 Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.515031 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="sg-core" containerID="cri-o://5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be" gracePeriod=30 Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.515078 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-notification-agent" containerID="cri-o://fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712" gracePeriod=30 Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.520337 4839 generic.go:334] "Generic (PLEG): container finished" podID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerID="50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345" exitCode=143 Mar 21 04:45:48 crc kubenswrapper[4839]: I0321 04:45:48.520383 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5788c8f798-khqlb" event={"ID":"30c2fe46-cd8a-43f9-8968-b6e65d7c862a","Type":"ContainerDied","Data":"50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345"} Mar 21 04:45:48 crc 
kubenswrapper[4839]: I0321 04:45:48.555288 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9983906820000001 podStartE2EDuration="6.555268491s" podCreationTimestamp="2026-03-21 04:45:42 +0000 UTC" firstStartedPulling="2026-03-21 04:45:43.228986669 +0000 UTC m=+1347.556773345" lastFinishedPulling="2026-03-21 04:45:47.785864478 +0000 UTC m=+1352.113651154" observedRunningTime="2026-03-21 04:45:48.54985561 +0000 UTC m=+1352.877642286" watchObservedRunningTime="2026-03-21 04:45:48.555268491 +0000 UTC m=+1352.883055167" Mar 21 04:45:49 crc kubenswrapper[4839]: I0321 04:45:49.535706 4839 generic.go:334] "Generic (PLEG): container finished" podID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerID="5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717" exitCode=0 Mar 21 04:45:49 crc kubenswrapper[4839]: I0321 04:45:49.536025 4839 generic.go:334] "Generic (PLEG): container finished" podID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerID="5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be" exitCode=2 Mar 21 04:45:49 crc kubenswrapper[4839]: I0321 04:45:49.536060 4839 generic.go:334] "Generic (PLEG): container finished" podID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerID="fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712" exitCode=0 Mar 21 04:45:49 crc kubenswrapper[4839]: I0321 04:45:49.536110 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerDied","Data":"5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717"} Mar 21 04:45:49 crc kubenswrapper[4839]: I0321 04:45:49.536139 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerDied","Data":"5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be"} Mar 21 04:45:49 crc kubenswrapper[4839]: 
I0321 04:45:49.536152 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerDied","Data":"fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712"} Mar 21 04:45:49 crc kubenswrapper[4839]: I0321 04:45:49.538844 4839 generic.go:334] "Generic (PLEG): container finished" podID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerID="7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc" exitCode=143 Mar 21 04:45:49 crc kubenswrapper[4839]: I0321 04:45:49.538876 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"524772c8-3fdb-43dc-8532-1d8e9dcdeb97","Type":"ContainerDied","Data":"7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc"} Mar 21 04:45:50 crc kubenswrapper[4839]: I0321 04:45:50.103418 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 21 04:45:50 crc kubenswrapper[4839]: I0321 04:45:50.103688 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-log" containerID="cri-o://688009d7356d78e3eb36a5befafccac32153750022bd8fbc6ea8dbee86aced35" gracePeriod=30 Mar 21 04:45:50 crc kubenswrapper[4839]: I0321 04:45:50.103816 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-httpd" containerID="cri-o://9d897b01178474175025269c566e1858192f12c1b5756dd643a41a358a91f169" gracePeriod=30 Mar 21 04:45:50 crc kubenswrapper[4839]: I0321 04:45:50.559784 4839 generic.go:334] "Generic (PLEG): container finished" podID="506e1e04-5787-48bb-9165-96a55f0d3095" containerID="688009d7356d78e3eb36a5befafccac32153750022bd8fbc6ea8dbee86aced35" exitCode=143 Mar 21 04:45:50 crc 
kubenswrapper[4839]: I0321 04:45:50.560264 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"506e1e04-5787-48bb-9165-96a55f0d3095","Type":"ContainerDied","Data":"688009d7356d78e3eb36a5befafccac32153750022bd8fbc6ea8dbee86aced35"} Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.064926 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.257750 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-public-tls-certs\") pod \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.257824 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-config-data\") pod \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.257854 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-logs\") pod \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.257878 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nhpg\" (UniqueName: \"kubernetes.io/projected/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-kube-api-access-4nhpg\") pod \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.257966 4839 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-internal-tls-certs\") pod \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.258085 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-scripts\") pod \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.258106 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-combined-ca-bundle\") pod \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\" (UID: \"30c2fe46-cd8a-43f9-8968-b6e65d7c862a\") " Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.258492 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-logs" (OuterVolumeSpecName: "logs") pod "30c2fe46-cd8a-43f9-8968-b6e65d7c862a" (UID: "30c2fe46-cd8a-43f9-8968-b6e65d7c862a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.263463 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-scripts" (OuterVolumeSpecName: "scripts") pod "30c2fe46-cd8a-43f9-8968-b6e65d7c862a" (UID: "30c2fe46-cd8a-43f9-8968-b6e65d7c862a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.268697 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-kube-api-access-4nhpg" (OuterVolumeSpecName: "kube-api-access-4nhpg") pod "30c2fe46-cd8a-43f9-8968-b6e65d7c862a" (UID: "30c2fe46-cd8a-43f9-8968-b6e65d7c862a"). InnerVolumeSpecName "kube-api-access-4nhpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.338299 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30c2fe46-cd8a-43f9-8968-b6e65d7c862a" (UID: "30c2fe46-cd8a-43f9-8968-b6e65d7c862a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.338820 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-config-data" (OuterVolumeSpecName: "config-data") pod "30c2fe46-cd8a-43f9-8968-b6e65d7c862a" (UID: "30c2fe46-cd8a-43f9-8968-b6e65d7c862a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.363048 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.363085 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.363100 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.363114 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.363126 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nhpg\" (UniqueName: \"kubernetes.io/projected/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-kube-api-access-4nhpg\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.382085 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "30c2fe46-cd8a-43f9-8968-b6e65d7c862a" (UID: "30c2fe46-cd8a-43f9-8968-b6e65d7c862a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.393737 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "30c2fe46-cd8a-43f9-8968-b6e65d7c862a" (UID: "30c2fe46-cd8a-43f9-8968-b6e65d7c862a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.465384 4839 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.465430 4839 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30c2fe46-cd8a-43f9-8968-b6e65d7c862a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.569554 4839 generic.go:334] "Generic (PLEG): container finished" podID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerID="ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5" exitCode=0 Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.569610 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5788c8f798-khqlb" event={"ID":"30c2fe46-cd8a-43f9-8968-b6e65d7c862a","Type":"ContainerDied","Data":"ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5"} Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.569636 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5788c8f798-khqlb" event={"ID":"30c2fe46-cd8a-43f9-8968-b6e65d7c862a","Type":"ContainerDied","Data":"3897ef34a5a560221b0da70d53a0118dcc2423f236d8ea84230926286a71f6ee"} Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.569640 4839 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5788c8f798-khqlb" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.569651 4839 scope.go:117] "RemoveContainer" containerID="ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.629234 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5788c8f798-khqlb"] Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.636763 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5788c8f798-khqlb"] Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.652123 4839 scope.go:117] "RemoveContainer" containerID="50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.678265 4839 scope.go:117] "RemoveContainer" containerID="ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5" Mar 21 04:45:51 crc kubenswrapper[4839]: E0321 04:45:51.678929 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5\": container with ID starting with ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5 not found: ID does not exist" containerID="ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.678997 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5"} err="failed to get container status \"ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5\": rpc error: code = NotFound desc = could not find container \"ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5\": container with ID starting with ab24204f92b48f3bd0e1ca51a011a08b99bfb2f39778550bf43149f79a695df5 
not found: ID does not exist" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.679030 4839 scope.go:117] "RemoveContainer" containerID="50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345" Mar 21 04:45:51 crc kubenswrapper[4839]: E0321 04:45:51.682415 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345\": container with ID starting with 50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345 not found: ID does not exist" containerID="50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.682474 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345"} err="failed to get container status \"50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345\": rpc error: code = NotFound desc = could not find container \"50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345\": container with ID starting with 50769488e00d55eb3429e84962a3785fccd7fc5850dfd3db47f0ac5b5be66345 not found: ID does not exist" Mar 21 04:45:51 crc kubenswrapper[4839]: I0321 04:45:51.991226 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.077765 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-scripts\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078096 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgs9m\" (UniqueName: \"kubernetes.io/projected/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-kube-api-access-kgs9m\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078191 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-logs\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078227 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-public-tls-certs\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078307 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078346 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-config-data\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078378 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-combined-ca-bundle\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078405 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-httpd-run\") pod \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\" (UID: \"524772c8-3fdb-43dc-8532-1d8e9dcdeb97\") " Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078428 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-logs" (OuterVolumeSpecName: "logs") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.078854 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.079080 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.094456 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.094881 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-kube-api-access-kgs9m" (OuterVolumeSpecName: "kube-api-access-kgs9m") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "kube-api-access-kgs9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.096734 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-scripts" (OuterVolumeSpecName: "scripts") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.109385 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.179543 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.179746 4839 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.179759 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.179768 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgs9m\" (UniqueName: \"kubernetes.io/projected/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-kube-api-access-kgs9m\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.179798 4839 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.214660 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5dvtr"] Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215384 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-httpd" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215412 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-httpd" Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215435 4839 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-log" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215442 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-log" Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215450 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-log" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215460 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-log" Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215470 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215477 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215493 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon-log" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215499 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon-log" Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215515 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-api" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215520 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-api" Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215535 4839 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="60534a44-1538-4bdb-81d1-043c9ae84cee" containerName="mariadb-account-create-update" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215541 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="60534a44-1538-4bdb-81d1-043c9ae84cee" containerName="mariadb-account-create-update" Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215557 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4185a56e-9d10-4aea-ad84-a865dff3e6be" containerName="mariadb-account-create-update" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215582 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="4185a56e-9d10-4aea-ad84-a865dff3e6be" containerName="mariadb-account-create-update" Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215593 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76e9253-1495-42d5-910f-cce6f2730243" containerName="mariadb-database-create" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215599 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76e9253-1495-42d5-910f-cce6f2730243" containerName="mariadb-database-create" Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215620 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9220ed3c-2e97-4efc-a4cc-28bb29774ad8" containerName="mariadb-database-create" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215626 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="9220ed3c-2e97-4efc-a4cc-28bb29774ad8" containerName="mariadb-database-create" Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.215640 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c56098-2959-4bd0-b762-36a4ee1bb2e6" containerName="mariadb-database-create" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215651 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c56098-2959-4bd0-b762-36a4ee1bb2e6" containerName="mariadb-database-create" Mar 21 04:45:52 crc 
kubenswrapper[4839]: E0321 04:45:52.215664 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f481fb0d-ac2f-4989-a547-50f5081e4e78" containerName="mariadb-account-create-update" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215670 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f481fb0d-ac2f-4989-a547-50f5081e4e78" containerName="mariadb-account-create-update" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215866 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-log" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215878 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76e9253-1495-42d5-910f-cce6f2730243" containerName="mariadb-database-create" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215894 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="9220ed3c-2e97-4efc-a4cc-28bb29774ad8" containerName="mariadb-database-create" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215905 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c56098-2959-4bd0-b762-36a4ee1bb2e6" containerName="mariadb-database-create" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215912 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-api" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215929 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" containerName="placement-log" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215940 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="60534a44-1538-4bdb-81d1-043c9ae84cee" containerName="mariadb-account-create-update" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215949 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215957 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="4185a56e-9d10-4aea-ad84-a865dff3e6be" containerName="mariadb-account-create-update" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215964 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerName="glance-httpd" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215973 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f481fb0d-ac2f-4989-a547-50f5081e4e78" containerName="mariadb-account-create-update" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.215985 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3b26c3a-55d5-442a-9c31-187b0aa60f90" containerName="horizon-log" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.216880 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.220732 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.221072 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.221222 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-t66x4" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.226593 4839 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.231808 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.240824 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5dvtr"] Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.242633 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-config-data" (OuterVolumeSpecName: "config-data") pod "524772c8-3fdb-43dc-8532-1d8e9dcdeb97" (UID: "524772c8-3fdb-43dc-8532-1d8e9dcdeb97"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.283981 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmr76\" (UniqueName: \"kubernetes.io/projected/bbaf057c-375e-4da6-a7cd-8c879a51ff50-kube-api-access-kmr76\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.284046 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-config-data\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.284067 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.284194 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-scripts\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.284255 4839 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 
21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.284270 4839 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.284361 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/524772c8-3fdb-43dc-8532-1d8e9dcdeb97-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.385603 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-scripts\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.385679 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmr76\" (UniqueName: \"kubernetes.io/projected/bbaf057c-375e-4da6-a7cd-8c879a51ff50-kube-api-access-kmr76\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.385707 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-config-data\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.385723 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5dvtr\" 
(UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.391693 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-scripts\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.392186 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.392708 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-config-data\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.416473 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmr76\" (UniqueName: \"kubernetes.io/projected/bbaf057c-375e-4da6-a7cd-8c879a51ff50-kube-api-access-kmr76\") pod \"nova-cell0-conductor-db-sync-5dvtr\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.463712 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c2fe46-cd8a-43f9-8968-b6e65d7c862a" path="/var/lib/kubelet/pods/30c2fe46-cd8a-43f9-8968-b6e65d7c862a/volumes" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.548266 4839 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.581748 4839 generic.go:334] "Generic (PLEG): container finished" podID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" containerID="f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34" exitCode=0 Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.581807 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"524772c8-3fdb-43dc-8532-1d8e9dcdeb97","Type":"ContainerDied","Data":"f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34"} Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.581839 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"524772c8-3fdb-43dc-8532-1d8e9dcdeb97","Type":"ContainerDied","Data":"d1e4b5b263d8711e41038cc9c72c0cf72e4c984c445036bd50e6733715123ea1"} Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.581860 4839 scope.go:117] "RemoveContainer" containerID="f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.582005 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.690186 4839 scope.go:117] "RemoveContainer" containerID="7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.697337 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.705059 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.718723 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.720525 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.723700 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.724360 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.728524 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.744685 4839 scope.go:117] "RemoveContainer" containerID="f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34" Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.745483 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34\": container with ID starting with f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34 not found: ID 
does not exist" containerID="f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.745521 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34"} err="failed to get container status \"f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34\": rpc error: code = NotFound desc = could not find container \"f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34\": container with ID starting with f646a915486b1a93d441b9746d2db8ddcc03de6d349fab6b4e064f47d3119f34 not found: ID does not exist" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.745545 4839 scope.go:117] "RemoveContainer" containerID="7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc" Mar 21 04:45:52 crc kubenswrapper[4839]: E0321 04:45:52.746082 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc\": container with ID starting with 7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc not found: ID does not exist" containerID="7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.746161 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc"} err="failed to get container status \"7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc\": rpc error: code = NotFound desc = could not find container \"7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc\": container with ID starting with 7a15ea0b23f3e1ed7bc4a276314021fe6cacf15124fb851f6f27bcccdd4586fc not found: ID does not exist" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895499 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q55mb\" (UniqueName: \"kubernetes.io/projected/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-kube-api-access-q55mb\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895555 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-scripts\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895689 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-logs\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895735 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895774 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895841 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895897 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.895920 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-config-data\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.996917 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.996993 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.997035 
4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.997054 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-config-data\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.997082 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q55mb\" (UniqueName: \"kubernetes.io/projected/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-kube-api-access-q55mb\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.997101 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-scripts\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.997140 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-logs\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.997170 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.997541 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.998272 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:52 crc kubenswrapper[4839]: I0321 04:45:52.998273 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-logs\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.002781 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-scripts\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.003939 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.007467 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.008282 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.015825 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q55mb\" (UniqueName: \"kubernetes.io/projected/3e3e15ec-7425-4e0a-99a8-db3bb1cd486c-kube-api-access-q55mb\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.027730 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c\") " pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.038837 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5dvtr"] Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.061689 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.592885 4839 generic.go:334] "Generic (PLEG): container finished" podID="506e1e04-5787-48bb-9165-96a55f0d3095" containerID="9d897b01178474175025269c566e1858192f12c1b5756dd643a41a358a91f169" exitCode=0 Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.592939 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"506e1e04-5787-48bb-9165-96a55f0d3095","Type":"ContainerDied","Data":"9d897b01178474175025269c566e1858192f12c1b5756dd643a41a358a91f169"} Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.595540 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5dvtr" event={"ID":"bbaf057c-375e-4da6-a7cd-8c879a51ff50","Type":"ContainerStarted","Data":"167a9bd8f5cfd7d579c6c62283502e438b5ae393020828f4bac8087f747ad53c"} Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.618770 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 21 04:45:53 crc kubenswrapper[4839]: I0321 04:45:53.858817 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019409 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-combined-ca-bundle\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019477 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-internal-tls-certs\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019501 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-scripts\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019550 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-config-data\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019614 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-httpd-run\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019658 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-logs\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019682 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfhfr\" (UniqueName: \"kubernetes.io/projected/506e1e04-5787-48bb-9165-96a55f0d3095-kube-api-access-tfhfr\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.019707 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"506e1e04-5787-48bb-9165-96a55f0d3095\" (UID: \"506e1e04-5787-48bb-9165-96a55f0d3095\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.020980 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.021081 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-logs" (OuterVolumeSpecName: "logs") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.031026 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/506e1e04-5787-48bb-9165-96a55f0d3095-kube-api-access-tfhfr" (OuterVolumeSpecName: "kube-api-access-tfhfr") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "kube-api-access-tfhfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.031106 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-scripts" (OuterVolumeSpecName: "scripts") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.035456 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.061896 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.088390 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-config-data" (OuterVolumeSpecName: "config-data") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.094001 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.095677 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "506e1e04-5787-48bb-9165-96a55f0d3095" (UID: "506e1e04-5787-48bb-9165-96a55f0d3095"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.121416 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-config-data\") pod \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.121875 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-run-httpd\") pod \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.121905 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-scripts\") pod \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " Mar 21 04:45:54 crc 
kubenswrapper[4839]: I0321 04:45:54.121955 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-combined-ca-bundle\") pod \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.121985 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldtbn\" (UniqueName: \"kubernetes.io/projected/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-kube-api-access-ldtbn\") pod \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.122033 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-log-httpd\") pod \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.122065 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-sg-core-conf-yaml\") pod \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\" (UID: \"df08a3cb-a9ae-4b8e-a9c8-604c41db5158\") " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123215 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "df08a3cb-a9ae-4b8e-a9c8-604c41db5158" (UID: "df08a3cb-a9ae-4b8e-a9c8-604c41db5158"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123558 4839 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123684 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506e1e04-5787-48bb-9165-96a55f0d3095-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123697 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfhfr\" (UniqueName: \"kubernetes.io/projected/506e1e04-5787-48bb-9165-96a55f0d3095-kube-api-access-tfhfr\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123717 4839 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123727 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123735 4839 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123743 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123751 4839 reconciler_common.go:293] "Volume 
detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.123758 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506e1e04-5787-48bb-9165-96a55f0d3095-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.126870 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "df08a3cb-a9ae-4b8e-a9c8-604c41db5158" (UID: "df08a3cb-a9ae-4b8e-a9c8-604c41db5158"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.128784 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-kube-api-access-ldtbn" (OuterVolumeSpecName: "kube-api-access-ldtbn") pod "df08a3cb-a9ae-4b8e-a9c8-604c41db5158" (UID: "df08a3cb-a9ae-4b8e-a9c8-604c41db5158"). InnerVolumeSpecName "kube-api-access-ldtbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.129386 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-scripts" (OuterVolumeSpecName: "scripts") pod "df08a3cb-a9ae-4b8e-a9c8-604c41db5158" (UID: "df08a3cb-a9ae-4b8e-a9c8-604c41db5158"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.168191 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "df08a3cb-a9ae-4b8e-a9c8-604c41db5158" (UID: "df08a3cb-a9ae-4b8e-a9c8-604c41db5158"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.173547 4839 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.226043 4839 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.226369 4839 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.226382 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.226393 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldtbn\" (UniqueName: \"kubernetes.io/projected/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-kube-api-access-ldtbn\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.226403 4839 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.226488 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df08a3cb-a9ae-4b8e-a9c8-604c41db5158" (UID: "df08a3cb-a9ae-4b8e-a9c8-604c41db5158"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.248733 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-config-data" (OuterVolumeSpecName: "config-data") pod "df08a3cb-a9ae-4b8e-a9c8-604c41db5158" (UID: "df08a3cb-a9ae-4b8e-a9c8-604c41db5158"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.329767 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.329806 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df08a3cb-a9ae-4b8e-a9c8-604c41db5158-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.473143 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="524772c8-3fdb-43dc-8532-1d8e9dcdeb97" path="/var/lib/kubelet/pods/524772c8-3fdb-43dc-8532-1d8e9dcdeb97/volumes" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.618495 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"506e1e04-5787-48bb-9165-96a55f0d3095","Type":"ContainerDied","Data":"b3b20c1c58919d92014e2b8b23b7b38a20303479312dd8c6c51224fcd3f18728"} Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.618543 4839 scope.go:117] "RemoveContainer" containerID="9d897b01178474175025269c566e1858192f12c1b5756dd643a41a358a91f169" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.618681 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.625437 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c","Type":"ContainerStarted","Data":"5a8717c8e282ed1a9dd0c8621b1e357e6005c3b9225ddebfca3234dc9e7a1b1c"} Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.625810 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c","Type":"ContainerStarted","Data":"edef24dd73386b464b06e73affcab4a9872357de542f95bf3af4a775e757c502"} Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.632767 4839 generic.go:334] "Generic (PLEG): container finished" podID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerID="815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e" exitCode=0 Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.632809 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerDied","Data":"815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e"} Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.632835 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df08a3cb-a9ae-4b8e-a9c8-604c41db5158","Type":"ContainerDied","Data":"41ff38380ac8ed55675761ad2bd4b24ee85da709e085d038d76ac53207f2c9ae"} 
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.632890 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.655709 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.665737 4839 scope.go:117] "RemoveContainer" containerID="688009d7356d78e3eb36a5befafccac32153750022bd8fbc6ea8dbee86aced35"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.674075 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.705668 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.725382 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.758132 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.759708 4839 scope.go:117] "RemoveContainer" containerID="5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717"
Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.760854 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-central-agent"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.760882 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-central-agent"
Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.760899 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="sg-core"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.760906 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="sg-core"
Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.760920 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-log"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.760927 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-log"
Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.760937 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-httpd"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.760943 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-httpd"
Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.760967 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="proxy-httpd"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.760975 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="proxy-httpd"
Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.761001 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-notification-agent"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.761008 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-notification-agent"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.761293 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-notification-agent"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.761310 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-log"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.761321 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="ceilometer-central-agent"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.761330 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="sg-core"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.761338 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" containerName="proxy-httpd"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.761355 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" containerName="glance-httpd"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.762481 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.766687 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.766870 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.802107 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.808803 4839 scope.go:117] "RemoveContainer" containerID="5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.824983 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.827979 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.832539 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.832707 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.850153 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-run-httpd\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.850390 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.850423 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.850444 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-config-data\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.850479 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-log-httpd\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.850505 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp7fg\" (UniqueName: \"kubernetes.io/projected/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-kube-api-access-cp7fg\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.850560 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-scripts\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.857819 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.886435 4839 scope.go:117] "RemoveContainer" containerID="fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.924989 4839 scope.go:117] "RemoveContainer" containerID="815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.950641 4839 scope.go:117] "RemoveContainer" containerID="5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717"
Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.951400 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717\": container with ID starting with 5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717 not found: ID does not exist" containerID="5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.951456 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717"} err="failed to get container status \"5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717\": rpc error: code = NotFound desc = could not find container \"5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717\": container with ID starting with 5b4cfe3ae4d579e3c19a2481eace665e5e33eabedf68c279bc80562968a22717 not found: ID does not exist"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.951489 4839 scope.go:117] "RemoveContainer" containerID="5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be"
Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.951914 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be\": container with ID starting with 5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be not found: ID does not exist" containerID="5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.952107 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be"} err="failed to get container status \"5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be\": rpc error: code = NotFound desc = could not find container \"5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be\": container with ID starting with 5e8e76a9a5316a3f376f3d393cce449d73ead54bab022e83ee496cd3fcbd47be not found: ID does not exist"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953135 4839 scope.go:117] "RemoveContainer" containerID="fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.952002 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953456 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953488 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953520 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-config-data\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953581 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7aa4192-53bb-412e-b25e-1fe47c59fa75-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953620 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-log-httpd\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953659 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp7fg\" (UniqueName: \"kubernetes.io/projected/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-kube-api-access-cp7fg\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953692 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7aa4192-53bb-412e-b25e-1fe47c59fa75-logs\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953716 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953773 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-scripts\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953920 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953962 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.953985 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-run-httpd\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.954087 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qzpf\" (UniqueName: \"kubernetes.io/projected/c7aa4192-53bb-412e-b25e-1fe47c59fa75-kube-api-access-4qzpf\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.954180 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.954471 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-log-httpd\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.954693 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-run-httpd\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.959145 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.960588 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.962716 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-scripts\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.963807 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712\": container with ID starting with fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712 not found: ID does not exist" containerID="fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.963864 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712"} err="failed to get container status \"fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712\": rpc error: code = NotFound desc = could not find container \"fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712\": container with ID starting with fc0f3fd92cb474b1eb7bf19080b6bc0cbd204d90f6d9a3fb300c3450c2d63712 not found: ID does not exist"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.963897 4839 scope.go:117] "RemoveContainer" containerID="815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e"
Mar 21 04:45:54 crc kubenswrapper[4839]: E0321 04:45:54.965061 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e\": container with ID starting with 815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e not found: ID does not exist" containerID="815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.965099 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e"} err="failed to get container status \"815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e\": rpc error: code = NotFound desc = could not find container \"815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e\": container with ID starting with 815ecf57604f93f58650798c5566709077c42cf6955a2a6a4da70402fb94d50e not found: ID does not exist"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.967182 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-config-data\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:54 crc kubenswrapper[4839]: I0321 04:45:54.974548 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp7fg\" (UniqueName: \"kubernetes.io/projected/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-kube-api-access-cp7fg\") pod \"ceilometer-0\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " pod="openstack/ceilometer-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.056383 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.056884 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7aa4192-53bb-412e-b25e-1fe47c59fa75-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.056973 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7aa4192-53bb-412e-b25e-1fe47c59fa75-logs\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.057045 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.057193 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.057333 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.057920 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qzpf\" (UniqueName: \"kubernetes.io/projected/c7aa4192-53bb-412e-b25e-1fe47c59fa75-kube-api-access-4qzpf\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.058005 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.056980 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.060222 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7aa4192-53bb-412e-b25e-1fe47c59fa75-logs\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.060454 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7aa4192-53bb-412e-b25e-1fe47c59fa75-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.064429 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.070649 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.071497 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.076721 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7aa4192-53bb-412e-b25e-1fe47c59fa75-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.081585 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qzpf\" (UniqueName: \"kubernetes.io/projected/c7aa4192-53bb-412e-b25e-1fe47c59fa75-kube-api-access-4qzpf\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.105001 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7aa4192-53bb-412e-b25e-1fe47c59fa75\") " pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.163924 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.394544 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.651354 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3e3e15ec-7425-4e0a-99a8-db3bb1cd486c","Type":"ContainerStarted","Data":"9e1322d5a22397b4c22be78ee22e362bc9ad41eba5ef21736d759c4b8060bf8e"}
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.677363 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.6773410220000002 podStartE2EDuration="3.677341022s" podCreationTimestamp="2026-03-21 04:45:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:55.671996112 +0000 UTC m=+1359.999782798" watchObservedRunningTime="2026-03-21 04:45:55.677341022 +0000 UTC m=+1360.005127698"
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.742513 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:45:55 crc kubenswrapper[4839]: W0321 04:45:55.753953 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d47edf7_95e3_4bb5_ab87_27c9db5d05d6.slice/crio-d75d884bbe4a31f233cb211086c9cf5693ae2ba4faf11ef4b832e01a60ea7483 WatchSource:0}: Error finding container d75d884bbe4a31f233cb211086c9cf5693ae2ba4faf11ef4b832e01a60ea7483: Status 404 returned error can't find the container with id d75d884bbe4a31f233cb211086c9cf5693ae2ba4faf11ef4b832e01a60ea7483
Mar 21 04:45:55 crc kubenswrapper[4839]: I0321 04:45:55.926732 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 21 04:45:56 crc kubenswrapper[4839]: I0321 04:45:56.464417 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="506e1e04-5787-48bb-9165-96a55f0d3095" path="/var/lib/kubelet/pods/506e1e04-5787-48bb-9165-96a55f0d3095/volumes"
Mar 21 04:45:56 crc kubenswrapper[4839]: I0321 04:45:56.465503 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df08a3cb-a9ae-4b8e-a9c8-604c41db5158" path="/var/lib/kubelet/pods/df08a3cb-a9ae-4b8e-a9c8-604c41db5158/volumes"
Mar 21 04:45:56 crc kubenswrapper[4839]: I0321 04:45:56.680487 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7aa4192-53bb-412e-b25e-1fe47c59fa75","Type":"ContainerStarted","Data":"e89b3f506e5a960d98bb0c12eb2af752b39195c18d53d7ed12721c0cbc928e86"}
Mar 21 04:45:56 crc kubenswrapper[4839]: I0321 04:45:56.680537 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7aa4192-53bb-412e-b25e-1fe47c59fa75","Type":"ContainerStarted","Data":"cc947298e9a397670be68e795db4fd3f3eed3e7514187eed9facc497f7066d52"}
Mar 21 04:45:56 crc kubenswrapper[4839]: I0321 04:45:56.686872 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerStarted","Data":"cbba2b10323381d9b303a23b6607bd17c5906d7437bb21c852b760d41642da03"}
Mar 21 04:45:56 crc kubenswrapper[4839]: I0321 04:45:56.686955 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerStarted","Data":"d75d884bbe4a31f233cb211086c9cf5693ae2ba4faf11ef4b832e01a60ea7483"}
Mar 21 04:45:57 crc kubenswrapper[4839]: I0321 04:45:57.698375 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7aa4192-53bb-412e-b25e-1fe47c59fa75","Type":"ContainerStarted","Data":"29c2b41c34c58734758f0c4fa9ee85718e44617f6aedee7277175ff1043e3cd3"}
Mar 21 04:45:57 crc kubenswrapper[4839]: I0321 04:45:57.732386 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.732369502 podStartE2EDuration="3.732369502s" podCreationTimestamp="2026-03-21 04:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:45:57.718972445 +0000 UTC m=+1362.046759121" watchObservedRunningTime="2026-03-21 04:45:57.732369502 +0000 UTC m=+1362.060156178"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.146365 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567806-g4rcl"]
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.148127 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567806-g4rcl"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.150165 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.150889 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.151381 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.158676 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567806-g4rcl"]
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.267155 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88ds5\" (UniqueName: \"kubernetes.io/projected/75c1454e-0aed-48d9-a0f2-f7c2797156ce-kube-api-access-88ds5\") pod \"auto-csr-approver-29567806-g4rcl\" (UID: \"75c1454e-0aed-48d9-a0f2-f7c2797156ce\") " pod="openshift-infra/auto-csr-approver-29567806-g4rcl"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.370063 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88ds5\" (UniqueName: \"kubernetes.io/projected/75c1454e-0aed-48d9-a0f2-f7c2797156ce-kube-api-access-88ds5\") pod \"auto-csr-approver-29567806-g4rcl\" (UID: \"75c1454e-0aed-48d9-a0f2-f7c2797156ce\") " pod="openshift-infra/auto-csr-approver-29567806-g4rcl"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.389952 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88ds5\" (UniqueName: \"kubernetes.io/projected/75c1454e-0aed-48d9-a0f2-f7c2797156ce-kube-api-access-88ds5\") pod \"auto-csr-approver-29567806-g4rcl\" (UID: \"75c1454e-0aed-48d9-a0f2-f7c2797156ce\") " pod="openshift-infra/auto-csr-approver-29567806-g4rcl"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.471761 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567806-g4rcl"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.980041 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.980261 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.980320 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.981213 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"48bb6d2443587cf3023178aa72ea424c113f55b1e7600821dbf21c214de8e70f"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 21 04:46:00 crc kubenswrapper[4839]: I0321 04:46:00.981271 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://48bb6d2443587cf3023178aa72ea424c113f55b1e7600821dbf21c214de8e70f" gracePeriod=600
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.681965 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567806-g4rcl"]
Mar 21 04:46:01 crc kubenswrapper[4839]: W0321 04:46:01.682584 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75c1454e_0aed_48d9_a0f2_f7c2797156ce.slice/crio-bdf7c9c6ed03e29549d652d87f7937c2f76fe41b3839872da46b59ddd063bb8e WatchSource:0}: Error finding container bdf7c9c6ed03e29549d652d87f7937c2f76fe41b3839872da46b59ddd063bb8e: Status 404 returned error can't find the container with id bdf7c9c6ed03e29549d652d87f7937c2f76fe41b3839872da46b59ddd063bb8e
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.735622 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerStarted","Data":"1c072f26f9d106f6161eb66b3f6c9a76cc9db2ce4ede2775c0501c104b78c25c"}
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.738667 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5dvtr" event={"ID":"bbaf057c-375e-4da6-a7cd-8c879a51ff50","Type":"ContainerStarted","Data":"a57d3dec4c234a21b088b3986b8d9a4b8012dec53cc26619ad9bdd0f9475d8cc"}
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.741413 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="48bb6d2443587cf3023178aa72ea424c113f55b1e7600821dbf21c214de8e70f" exitCode=0
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.741476 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"48bb6d2443587cf3023178aa72ea424c113f55b1e7600821dbf21c214de8e70f"}
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.741729 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"c031ed8f7b7576f57e9530a46687f2f2de2e5c2a62f42435eef393cfd7af2b37"}
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.741807 4839 scope.go:117] "RemoveContainer" containerID="3ca17db50991abbb7e584e1a028ac5195afd6abd747f7e5e9969a64ed39bcf6c"
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.743413 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567806-g4rcl" event={"ID":"75c1454e-0aed-48d9-a0f2-f7c2797156ce","Type":"ContainerStarted","Data":"bdf7c9c6ed03e29549d652d87f7937c2f76fe41b3839872da46b59ddd063bb8e"}
Mar 21 04:46:01 crc kubenswrapper[4839]: I0321 04:46:01.763909 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-5dvtr" podStartSLOduration=1.453500151 podStartE2EDuration="9.763888061s" podCreationTimestamp="2026-03-21 04:45:52 +0000 UTC" firstStartedPulling="2026-03-21 04:45:53.036410662 +0000 UTC m=+1357.364197338"
lastFinishedPulling="2026-03-21 04:46:01.346798572 +0000 UTC m=+1365.674585248" observedRunningTime="2026-03-21 04:46:01.759748135 +0000 UTC m=+1366.087534811" watchObservedRunningTime="2026-03-21 04:46:01.763888061 +0000 UTC m=+1366.091674737" Mar 21 04:46:02 crc kubenswrapper[4839]: I0321 04:46:02.796742 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerStarted","Data":"1e51df175e72743e7d699f4e5fcec298f453f2659fd1cb0a4c210eba9115f1a3"} Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.062086 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.062483 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.099215 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.109708 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.805901 4839 generic.go:334] "Generic (PLEG): container finished" podID="75c1454e-0aed-48d9-a0f2-f7c2797156ce" containerID="66cb92ff47a88ccd93ffde6b9853588d4c4d5f3a25eb2c7a9862fbe6f8dc60f8" exitCode=0 Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.806022 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567806-g4rcl" event={"ID":"75c1454e-0aed-48d9-a0f2-f7c2797156ce","Type":"ContainerDied","Data":"66cb92ff47a88ccd93ffde6b9853588d4c4d5f3a25eb2c7a9862fbe6f8dc60f8"} Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.809205 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerStarted","Data":"1ebe513656b6f58bbf6f0d69227894541ab6a4fa4cbe47f5b1af5f7551f5352e"} Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.809689 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.809727 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.809738 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 21 04:46:03 crc kubenswrapper[4839]: I0321 04:46:03.852035 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.3344914279999998 podStartE2EDuration="9.852013202s" podCreationTimestamp="2026-03-21 04:45:54 +0000 UTC" firstStartedPulling="2026-03-21 04:45:55.756017906 +0000 UTC m=+1360.083804582" lastFinishedPulling="2026-03-21 04:46:03.27353968 +0000 UTC m=+1367.601326356" observedRunningTime="2026-03-21 04:46:03.851178389 +0000 UTC m=+1368.178965085" watchObservedRunningTime="2026-03-21 04:46:03.852013202 +0000 UTC m=+1368.179799878" Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.153909 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567806-g4rcl" Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.267009 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88ds5\" (UniqueName: \"kubernetes.io/projected/75c1454e-0aed-48d9-a0f2-f7c2797156ce-kube-api-access-88ds5\") pod \"75c1454e-0aed-48d9-a0f2-f7c2797156ce\" (UID: \"75c1454e-0aed-48d9-a0f2-f7c2797156ce\") " Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.282495 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75c1454e-0aed-48d9-a0f2-f7c2797156ce-kube-api-access-88ds5" (OuterVolumeSpecName: "kube-api-access-88ds5") pod "75c1454e-0aed-48d9-a0f2-f7c2797156ce" (UID: "75c1454e-0aed-48d9-a0f2-f7c2797156ce"). InnerVolumeSpecName "kube-api-access-88ds5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.370380 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88ds5\" (UniqueName: \"kubernetes.io/projected/75c1454e-0aed-48d9-a0f2-f7c2797156ce-kube-api-access-88ds5\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.395445 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.395501 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.429709 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.438714 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.793253 4839 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.807538 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.836295 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567806-g4rcl" Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.836532 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567806-g4rcl" event={"ID":"75c1454e-0aed-48d9-a0f2-f7c2797156ce","Type":"ContainerDied","Data":"bdf7c9c6ed03e29549d652d87f7937c2f76fe41b3839872da46b59ddd063bb8e"} Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.836561 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdf7c9c6ed03e29549d652d87f7937c2f76fe41b3839872da46b59ddd063bb8e" Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.837376 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 21 04:46:05 crc kubenswrapper[4839]: I0321 04:46:05.837878 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 21 04:46:06 crc kubenswrapper[4839]: I0321 04:46:06.254235 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567800-hzcbk"] Mar 21 04:46:06 crc kubenswrapper[4839]: I0321 04:46:06.262511 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567800-hzcbk"] Mar 21 04:46:06 crc kubenswrapper[4839]: I0321 04:46:06.465419 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2cd29b-967b-4cf6-9902-6f30ad049cb1" path="/var/lib/kubelet/pods/4a2cd29b-967b-4cf6-9902-6f30ad049cb1/volumes" Mar 21 
04:46:07 crc kubenswrapper[4839]: I0321 04:46:07.841189 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 21 04:46:07 crc kubenswrapper[4839]: I0321 04:46:07.855928 4839 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 04:46:07 crc kubenswrapper[4839]: I0321 04:46:07.875129 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 21 04:46:17 crc kubenswrapper[4839]: I0321 04:46:17.937177 4839 generic.go:334] "Generic (PLEG): container finished" podID="bbaf057c-375e-4da6-a7cd-8c879a51ff50" containerID="a57d3dec4c234a21b088b3986b8d9a4b8012dec53cc26619ad9bdd0f9475d8cc" exitCode=0 Mar 21 04:46:17 crc kubenswrapper[4839]: I0321 04:46:17.937264 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5dvtr" event={"ID":"bbaf057c-375e-4da6-a7cd-8c879a51ff50","Type":"ContainerDied","Data":"a57d3dec4c234a21b088b3986b8d9a4b8012dec53cc26619ad9bdd0f9475d8cc"} Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.306415 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.428052 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmr76\" (UniqueName: \"kubernetes.io/projected/bbaf057c-375e-4da6-a7cd-8c879a51ff50-kube-api-access-kmr76\") pod \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.428138 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-config-data\") pod \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.428159 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-scripts\") pod \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.428289 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-combined-ca-bundle\") pod \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\" (UID: \"bbaf057c-375e-4da6-a7cd-8c879a51ff50\") " Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.436495 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbaf057c-375e-4da6-a7cd-8c879a51ff50-kube-api-access-kmr76" (OuterVolumeSpecName: "kube-api-access-kmr76") pod "bbaf057c-375e-4da6-a7cd-8c879a51ff50" (UID: "bbaf057c-375e-4da6-a7cd-8c879a51ff50"). InnerVolumeSpecName "kube-api-access-kmr76". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.437186 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-scripts" (OuterVolumeSpecName: "scripts") pod "bbaf057c-375e-4da6-a7cd-8c879a51ff50" (UID: "bbaf057c-375e-4da6-a7cd-8c879a51ff50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.455834 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-config-data" (OuterVolumeSpecName: "config-data") pod "bbaf057c-375e-4da6-a7cd-8c879a51ff50" (UID: "bbaf057c-375e-4da6-a7cd-8c879a51ff50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.460768 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbaf057c-375e-4da6-a7cd-8c879a51ff50" (UID: "bbaf057c-375e-4da6-a7cd-8c879a51ff50"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.530975 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmr76\" (UniqueName: \"kubernetes.io/projected/bbaf057c-375e-4da6-a7cd-8c879a51ff50-kube-api-access-kmr76\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.531113 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.531131 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.531144 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbaf057c-375e-4da6-a7cd-8c879a51ff50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.959340 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5dvtr" event={"ID":"bbaf057c-375e-4da6-a7cd-8c879a51ff50","Type":"ContainerDied","Data":"167a9bd8f5cfd7d579c6c62283502e438b5ae393020828f4bac8087f747ad53c"} Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.959386 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="167a9bd8f5cfd7d579c6c62283502e438b5ae393020828f4bac8087f747ad53c" Mar 21 04:46:19 crc kubenswrapper[4839]: I0321 04:46:19.959452 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5dvtr" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.063299 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 21 04:46:20 crc kubenswrapper[4839]: E0321 04:46:20.064009 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbaf057c-375e-4da6-a7cd-8c879a51ff50" containerName="nova-cell0-conductor-db-sync" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.064029 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbaf057c-375e-4da6-a7cd-8c879a51ff50" containerName="nova-cell0-conductor-db-sync" Mar 21 04:46:20 crc kubenswrapper[4839]: E0321 04:46:20.064052 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c1454e-0aed-48d9-a0f2-f7c2797156ce" containerName="oc" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.064059 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c1454e-0aed-48d9-a0f2-f7c2797156ce" containerName="oc" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.064253 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbaf057c-375e-4da6-a7cd-8c879a51ff50" containerName="nova-cell0-conductor-db-sync" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.064281 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="75c1454e-0aed-48d9-a0f2-f7c2797156ce" containerName="oc" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.064965 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.067464 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-t66x4" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.067666 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.075610 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.142282 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152d0351-12d2-4cf1-ad49-fd943b223442-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.142363 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152d0351-12d2-4cf1-ad49-fd943b223442-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.142398 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt2lq\" (UniqueName: \"kubernetes.io/projected/152d0351-12d2-4cf1-ad49-fd943b223442-kube-api-access-dt2lq\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.244354 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/152d0351-12d2-4cf1-ad49-fd943b223442-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.244423 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt2lq\" (UniqueName: \"kubernetes.io/projected/152d0351-12d2-4cf1-ad49-fd943b223442-kube-api-access-dt2lq\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.244594 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152d0351-12d2-4cf1-ad49-fd943b223442-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.249868 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152d0351-12d2-4cf1-ad49-fd943b223442-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.253366 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/152d0351-12d2-4cf1-ad49-fd943b223442-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.264276 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt2lq\" (UniqueName: \"kubernetes.io/projected/152d0351-12d2-4cf1-ad49-fd943b223442-kube-api-access-dt2lq\") pod \"nova-cell0-conductor-0\" (UID: 
\"152d0351-12d2-4cf1-ad49-fd943b223442\") " pod="openstack/nova-cell0-conductor-0" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.383376 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.819891 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 21 04:46:20 crc kubenswrapper[4839]: W0321 04:46:20.830445 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod152d0351_12d2_4cf1_ad49_fd943b223442.slice/crio-f05bbac8b138a0f5d634f17f9935fc2d124e9f923454d943a82c4716905ac60f WatchSource:0}: Error finding container f05bbac8b138a0f5d634f17f9935fc2d124e9f923454d943a82c4716905ac60f: Status 404 returned error can't find the container with id f05bbac8b138a0f5d634f17f9935fc2d124e9f923454d943a82c4716905ac60f Mar 21 04:46:20 crc kubenswrapper[4839]: I0321 04:46:20.968850 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"152d0351-12d2-4cf1-ad49-fd943b223442","Type":"ContainerStarted","Data":"f05bbac8b138a0f5d634f17f9935fc2d124e9f923454d943a82c4716905ac60f"} Mar 21 04:46:21 crc kubenswrapper[4839]: I0321 04:46:21.982598 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"152d0351-12d2-4cf1-ad49-fd943b223442","Type":"ContainerStarted","Data":"561a4d809fd1b08d12fca4778c4c1c7d41e351f065cfb7df8cce030baf49ce86"} Mar 21 04:46:21 crc kubenswrapper[4839]: I0321 04:46:21.982796 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 21 04:46:22 crc kubenswrapper[4839]: I0321 04:46:22.009895 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.009866404 podStartE2EDuration="2.009866404s" 
podCreationTimestamp="2026-03-21 04:46:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:46:21.999869792 +0000 UTC m=+1386.327656528" watchObservedRunningTime="2026-03-21 04:46:22.009866404 +0000 UTC m=+1386.337653120" Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.169448 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.413557 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.922423 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-csj7l"] Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.923878 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.925513 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.925761 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.933226 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-csj7l"] Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.975986 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-config-data\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.976061 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wj7k\" (UniqueName: \"kubernetes.io/projected/37c6fbf7-427d-45a8-b190-439265c8d6d0-kube-api-access-5wj7k\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.976185 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-scripts\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:25 crc kubenswrapper[4839]: I0321 04:46:25.976226 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.064868 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.065939 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.068471 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.077958 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-config-data\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.078015 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wj7k\" (UniqueName: \"kubernetes.io/projected/37c6fbf7-427d-45a8-b190-439265c8d6d0-kube-api-access-5wj7k\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.078083 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-scripts\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.078105 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.084357 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.091233 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.101160 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-scripts\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.102552 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-config-data\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.135389 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wj7k\" (UniqueName: \"kubernetes.io/projected/37c6fbf7-427d-45a8-b190-439265c8d6d0-kube-api-access-5wj7k\") pod \"nova-cell0-cell-mapping-csj7l\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.181771 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.181929 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-config-data\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.181990 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ww4d\" (UniqueName: \"kubernetes.io/projected/62694a5a-1565-4831-bff3-504a782692bb-kube-api-access-6ww4d\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.191384 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.229345 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.234142 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.238052 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.239550 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.248437 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.259688 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.274628 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.290317 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-config-data\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.290395 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.290457 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dwgm\" (UniqueName: \"kubernetes.io/projected/205b5c5e-c09f-4b4a-8a56-f98531ad0125-kube-api-access-2dwgm\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.290689 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-config-data\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.290860 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205b5c5e-c09f-4b4a-8a56-f98531ad0125-logs\") pod 
\"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.290927 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ww4d\" (UniqueName: \"kubernetes.io/projected/62694a5a-1565-4831-bff3-504a782692bb-kube-api-access-6ww4d\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.291163 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.310641 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.316338 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-config-data\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.327765 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.330267 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.331337 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.333998 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.340134 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ww4d\" (UniqueName: \"kubernetes.io/projected/62694a5a-1565-4831-bff3-504a782692bb-kube-api-access-6ww4d\") pod \"nova-scheduler-0\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.386524 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.393344 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.393602 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xbgl\" (UniqueName: \"kubernetes.io/projected/3307932f-5c67-4abb-9649-e4b3a0a19e9c-kube-api-access-5xbgl\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394033 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zk4h\" (UniqueName: \"kubernetes.io/projected/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-kube-api-access-4zk4h\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394146 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394250 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205b5c5e-c09f-4b4a-8a56-f98531ad0125-logs\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394374 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394511 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-logs\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394633 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-config-data\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394739 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394831 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dwgm\" (UniqueName: \"kubernetes.io/projected/205b5c5e-c09f-4b4a-8a56-f98531ad0125-kube-api-access-2dwgm\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.394918 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-config-data\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.400369 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.400819 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205b5c5e-c09f-4b4a-8a56-f98531ad0125-logs\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 
crc kubenswrapper[4839]: I0321 04:46:26.421211 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dwgm\" (UniqueName: \"kubernetes.io/projected/205b5c5e-c09f-4b4a-8a56-f98531ad0125-kube-api-access-2dwgm\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.421274 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-gwlp7"] Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.422719 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.430264 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-gwlp7"] Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.466892 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-config-data\") pod \"nova-metadata-0\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.497991 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498051 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xbgl\" (UniqueName: \"kubernetes.io/projected/3307932f-5c67-4abb-9649-e4b3a0a19e9c-kube-api-access-5xbgl\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc 
kubenswrapper[4839]: I0321 04:46:26.498126 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498160 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zk4h\" (UniqueName: \"kubernetes.io/projected/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-kube-api-access-4zk4h\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498182 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498215 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498265 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498289 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-config\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498350 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498405 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-svc\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498428 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-logs\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498455 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c6c4\" (UniqueName: \"kubernetes.io/projected/378a796b-e896-48a8-9e03-65e3b371c636-kube-api-access-4c6c4\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.498530 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-config-data\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.502627 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-logs\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.503478 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-config-data\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.506385 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.506938 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.509731 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: 
I0321 04:46:26.516872 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.528048 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zk4h\" (UniqueName: \"kubernetes.io/projected/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-kube-api-access-4zk4h\") pod \"nova-api-0\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.531095 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xbgl\" (UniqueName: \"kubernetes.io/projected/3307932f-5c67-4abb-9649-e4b3a0a19e9c-kube-api-access-5xbgl\") pod \"nova-cell1-novncproxy-0\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.539706 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.600783 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.600895 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.600954 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-config\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.601023 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.601222 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-svc\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.601265 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c6c4\" (UniqueName: \"kubernetes.io/projected/378a796b-e896-48a8-9e03-65e3b371c636-kube-api-access-4c6c4\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.603089 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.603421 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.604008 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.604610 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-svc\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.604859 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-config\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.624037 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c6c4\" (UniqueName: \"kubernetes.io/projected/378a796b-e896-48a8-9e03-65e3b371c636-kube-api-access-4c6c4\") pod \"dnsmasq-dns-bccf8f775-gwlp7\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.696703 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.827065 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.849761 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:26 crc kubenswrapper[4839]: I0321 04:46:26.862345 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-csj7l"] Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.082702 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-csj7l" event={"ID":"37c6fbf7-427d-45a8-b190-439265c8d6d0","Type":"ContainerStarted","Data":"deb2bfcfde4895ebb1cae51b1ec4d964acc93bf15dd9d8c3a0d4a4811a853624"} Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.109126 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:46:27 crc kubenswrapper[4839]: W0321 04:46:27.118813 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60e89b8f_2e6a_43a6_a9de_162a457cc5fb.slice/crio-578df92eb43a9d3deb754fc3112c79ab2340cf1a5936b7d1362d0e02e009882d WatchSource:0}: Error finding container 578df92eb43a9d3deb754fc3112c79ab2340cf1a5936b7d1362d0e02e009882d: Status 404 returned error can't find the container with id 578df92eb43a9d3deb754fc3112c79ab2340cf1a5936b7d1362d0e02e009882d Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.184960 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.278902 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jznl6"] Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.280796 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.286803 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.286990 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.289710 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jznl6"] Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.326501 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:46:27 crc kubenswrapper[4839]: W0321 04:46:27.356698 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod205b5c5e_c09f_4b4a_8a56_f98531ad0125.slice/crio-5792eedab352d18af4b7af67287b836849e3b15e2d915a2161d15245e06868bd WatchSource:0}: Error finding container 5792eedab352d18af4b7af67287b836849e3b15e2d915a2161d15245e06868bd: Status 404 returned error can't find the container with id 5792eedab352d18af4b7af67287b836849e3b15e2d915a2161d15245e06868bd Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.416637 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.416783 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rrz9\" (UniqueName: 
\"kubernetes.io/projected/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-kube-api-access-6rrz9\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.416802 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-config-data\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.416837 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-scripts\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.518235 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rrz9\" (UniqueName: \"kubernetes.io/projected/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-kube-api-access-6rrz9\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.518605 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-config-data\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.518738 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-scripts\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.518870 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.526223 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-config-data\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.533998 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.535415 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-scripts\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.545839 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rrz9\" (UniqueName: 
\"kubernetes.io/projected/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-kube-api-access-6rrz9\") pod \"nova-cell1-conductor-db-sync-jznl6\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.556691 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 04:46:27 crc kubenswrapper[4839]: W0321 04:46:27.578303 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod378a796b_e896_48a8_9e03_65e3b371c636.slice/crio-49d3afde602a166e8c5a9ef71743020fdf1f738c3940d641e7dae2434ec0eb13 WatchSource:0}: Error finding container 49d3afde602a166e8c5a9ef71743020fdf1f738c3940d641e7dae2434ec0eb13: Status 404 returned error can't find the container with id 49d3afde602a166e8c5a9ef71743020fdf1f738c3940d641e7dae2434ec0eb13 Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.597786 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-gwlp7"] Mar 21 04:46:27 crc kubenswrapper[4839]: I0321 04:46:27.647662 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.099892 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60e89b8f-2e6a-43a6-a9de-162a457cc5fb","Type":"ContainerStarted","Data":"578df92eb43a9d3deb754fc3112c79ab2340cf1a5936b7d1362d0e02e009882d"} Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.116917 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"205b5c5e-c09f-4b4a-8a56-f98531ad0125","Type":"ContainerStarted","Data":"5792eedab352d18af4b7af67287b836849e3b15e2d915a2161d15245e06868bd"} Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.135517 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3307932f-5c67-4abb-9649-e4b3a0a19e9c","Type":"ContainerStarted","Data":"51088d3062d2b2360e5c4a54fc629b5fbeeafa49a6e356a501876595e528c519"} Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.156111 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-csj7l" event={"ID":"37c6fbf7-427d-45a8-b190-439265c8d6d0","Type":"ContainerStarted","Data":"6500e5c41c0724032a37daabaaadca5a2ab96ab0732aaceeaaccdf5e739d902c"} Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.164237 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"62694a5a-1565-4831-bff3-504a782692bb","Type":"ContainerStarted","Data":"0b486a0ca8e515cbd2ddc4f12af8c937feaf1c88976b8dbba2cd271361b2775c"} Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.175454 4839 generic.go:334] "Generic (PLEG): container finished" podID="378a796b-e896-48a8-9e03-65e3b371c636" containerID="880297fb77f65981125f101cde38f55dd95860faac6dbd936272889d1aa0b1aa" exitCode=0 Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.175499 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" event={"ID":"378a796b-e896-48a8-9e03-65e3b371c636","Type":"ContainerDied","Data":"880297fb77f65981125f101cde38f55dd95860faac6dbd936272889d1aa0b1aa"} Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.175523 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" event={"ID":"378a796b-e896-48a8-9e03-65e3b371c636","Type":"ContainerStarted","Data":"49d3afde602a166e8c5a9ef71743020fdf1f738c3940d641e7dae2434ec0eb13"} Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.206081 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-csj7l" podStartSLOduration=3.206060108 podStartE2EDuration="3.206060108s" podCreationTimestamp="2026-03-21 04:46:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:46:28.176499486 +0000 UTC m=+1392.504286162" watchObservedRunningTime="2026-03-21 04:46:28.206060108 +0000 UTC m=+1392.533846774" Mar 21 04:46:28 crc kubenswrapper[4839]: I0321 04:46:28.266831 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jznl6"] Mar 21 04:46:29 crc kubenswrapper[4839]: I0321 04:46:29.202394 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jznl6" event={"ID":"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d","Type":"ContainerStarted","Data":"118f2c293ce181a9defa7eb0621b40d7a4ec32e8ea91c36b0f98ccebfdd6ba13"} Mar 21 04:46:29 crc kubenswrapper[4839]: I0321 04:46:29.202789 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jznl6" event={"ID":"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d","Type":"ContainerStarted","Data":"adab925016451244fae9f2cf83f23ed7b20a7f3728fde316dcb382033aa897aa"} Mar 21 04:46:29 crc kubenswrapper[4839]: I0321 04:46:29.211713 4839 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" event={"ID":"378a796b-e896-48a8-9e03-65e3b371c636","Type":"ContainerStarted","Data":"d33f1cdd73480cf38d5a67e559fe413c35de9b47b49b6298618c23ca1c61bfaa"} Mar 21 04:46:29 crc kubenswrapper[4839]: I0321 04:46:29.211769 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:29 crc kubenswrapper[4839]: I0321 04:46:29.223902 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-jznl6" podStartSLOduration=2.223877025 podStartE2EDuration="2.223877025s" podCreationTimestamp="2026-03-21 04:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:46:29.221243361 +0000 UTC m=+1393.549030037" watchObservedRunningTime="2026-03-21 04:46:29.223877025 +0000 UTC m=+1393.551663711" Mar 21 04:46:29 crc kubenswrapper[4839]: I0321 04:46:29.257358 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" podStartSLOduration=3.257339287 podStartE2EDuration="3.257339287s" podCreationTimestamp="2026-03-21 04:46:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:46:29.250583167 +0000 UTC m=+1393.578369853" watchObservedRunningTime="2026-03-21 04:46:29.257339287 +0000 UTC m=+1393.585125963" Mar 21 04:46:30 crc kubenswrapper[4839]: I0321 04:46:30.182769 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 21 04:46:30 crc kubenswrapper[4839]: I0321 04:46:30.224978 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.254099 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"3307932f-5c67-4abb-9649-e4b3a0a19e9c","Type":"ContainerStarted","Data":"09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae"} Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.254152 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="3307932f-5c67-4abb-9649-e4b3a0a19e9c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae" gracePeriod=30 Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.256773 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"62694a5a-1565-4831-bff3-504a782692bb","Type":"ContainerStarted","Data":"8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32"} Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.263421 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60e89b8f-2e6a-43a6-a9de-162a457cc5fb","Type":"ContainerStarted","Data":"7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c"} Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.263464 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60e89b8f-2e6a-43a6-a9de-162a457cc5fb","Type":"ContainerStarted","Data":"226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0"} Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.266029 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"205b5c5e-c09f-4b4a-8a56-f98531ad0125","Type":"ContainerStarted","Data":"1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea"} Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.266063 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"205b5c5e-c09f-4b4a-8a56-f98531ad0125","Type":"ContainerStarted","Data":"3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c"} Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.266183 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-log" containerID="cri-o://3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c" gracePeriod=30 Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.266461 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-metadata" containerID="cri-o://1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea" gracePeriod=30 Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.281425 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.238367916 podStartE2EDuration="6.28140755s" podCreationTimestamp="2026-03-21 04:46:26 +0000 UTC" firstStartedPulling="2026-03-21 04:46:27.569247654 +0000 UTC m=+1391.897034340" lastFinishedPulling="2026-03-21 04:46:31.612287298 +0000 UTC m=+1395.940073974" observedRunningTime="2026-03-21 04:46:32.277301205 +0000 UTC m=+1396.605087881" watchObservedRunningTime="2026-03-21 04:46:32.28140755 +0000 UTC m=+1396.609194226" Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.301544 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.919290498 podStartE2EDuration="6.301527947s" podCreationTimestamp="2026-03-21 04:46:26 +0000 UTC" firstStartedPulling="2026-03-21 04:46:27.230171392 +0000 UTC m=+1391.557958078" lastFinishedPulling="2026-03-21 04:46:31.612408851 +0000 UTC m=+1395.940195527" observedRunningTime="2026-03-21 04:46:32.298323167 +0000 UTC 
m=+1396.626109853" watchObservedRunningTime="2026-03-21 04:46:32.301527947 +0000 UTC m=+1396.629314623" Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.330411 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.080120854 podStartE2EDuration="6.330388379s" podCreationTimestamp="2026-03-21 04:46:26 +0000 UTC" firstStartedPulling="2026-03-21 04:46:27.362099815 +0000 UTC m=+1391.689886491" lastFinishedPulling="2026-03-21 04:46:31.61236734 +0000 UTC m=+1395.940154016" observedRunningTime="2026-03-21 04:46:32.324765701 +0000 UTC m=+1396.652552377" watchObservedRunningTime="2026-03-21 04:46:32.330388379 +0000 UTC m=+1396.658175055" Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.372076 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.8874762330000001 podStartE2EDuration="6.372047712s" podCreationTimestamp="2026-03-21 04:46:26 +0000 UTC" firstStartedPulling="2026-03-21 04:46:27.132924155 +0000 UTC m=+1391.460710831" lastFinishedPulling="2026-03-21 04:46:31.617495634 +0000 UTC m=+1395.945282310" observedRunningTime="2026-03-21 04:46:32.351222955 +0000 UTC m=+1396.679009631" watchObservedRunningTime="2026-03-21 04:46:32.372047712 +0000 UTC m=+1396.699834398" Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.542399 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:46:32 crc kubenswrapper[4839]: I0321 04:46:32.542670 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" containerName="kube-state-metrics" containerID="cri-o://ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b" gracePeriod=30 Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.024049 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.171376 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr9w9\" (UniqueName: \"kubernetes.io/projected/76b8f1b8-aa66-4f5e-937a-f837a2da28f1-kube-api-access-fr9w9\") pod \"76b8f1b8-aa66-4f5e-937a-f837a2da28f1\" (UID: \"76b8f1b8-aa66-4f5e-937a-f837a2da28f1\") " Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.178088 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76b8f1b8-aa66-4f5e-937a-f837a2da28f1-kube-api-access-fr9w9" (OuterVolumeSpecName: "kube-api-access-fr9w9") pod "76b8f1b8-aa66-4f5e-937a-f837a2da28f1" (UID: "76b8f1b8-aa66-4f5e-937a-f837a2da28f1"). InnerVolumeSpecName "kube-api-access-fr9w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.273184 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr9w9\" (UniqueName: \"kubernetes.io/projected/76b8f1b8-aa66-4f5e-937a-f837a2da28f1-kube-api-access-fr9w9\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.276657 4839 generic.go:334] "Generic (PLEG): container finished" podID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" containerID="ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b" exitCode=2 Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.276727 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.276717 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"76b8f1b8-aa66-4f5e-937a-f837a2da28f1","Type":"ContainerDied","Data":"ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b"} Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.276870 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"76b8f1b8-aa66-4f5e-937a-f837a2da28f1","Type":"ContainerDied","Data":"c3e02332eed0f6ac50479a637c2f9551186161a99dab978e61007f6da0cf9aba"} Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.276893 4839 scope.go:117] "RemoveContainer" containerID="ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.279234 4839 generic.go:334] "Generic (PLEG): container finished" podID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerID="3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c" exitCode=143 Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.279313 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"205b5c5e-c09f-4b4a-8a56-f98531ad0125","Type":"ContainerDied","Data":"3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c"} Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.297416 4839 scope.go:117] "RemoveContainer" containerID="ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b" Mar 21 04:46:33 crc kubenswrapper[4839]: E0321 04:46:33.297860 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b\": container with ID starting with ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b not found: ID does not exist" 
containerID="ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.297911 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b"} err="failed to get container status \"ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b\": rpc error: code = NotFound desc = could not find container \"ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b\": container with ID starting with ad857e3bd5d30310b9baab625e84f950c629d3216d303b1cad7b0178fb62aa6b not found: ID does not exist" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.314348 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.323500 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.339021 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:46:33 crc kubenswrapper[4839]: E0321 04:46:33.340711 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" containerName="kube-state-metrics" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.340747 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" containerName="kube-state-metrics" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.341623 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" containerName="kube-state-metrics" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.342739 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.348098 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.348496 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.389927 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.481226 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.481307 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.481348 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.481377 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpthn\" (UniqueName: 
\"kubernetes.io/projected/1626316f-b029-4424-b783-25eeb2790eb2-kube-api-access-zpthn\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.583154 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpthn\" (UniqueName: \"kubernetes.io/projected/1626316f-b029-4424-b783-25eeb2790eb2-kube-api-access-zpthn\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.583451 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.583580 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.583672 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.590285 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.592925 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.601788 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpthn\" (UniqueName: \"kubernetes.io/projected/1626316f-b029-4424-b783-25eeb2790eb2-kube-api-access-zpthn\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.604330 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1626316f-b029-4424-b783-25eeb2790eb2-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1626316f-b029-4424-b783-25eeb2790eb2\") " pod="openstack/kube-state-metrics-0" Mar 21 04:46:33 crc kubenswrapper[4839]: I0321 04:46:33.666802 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 21 04:46:34 crc kubenswrapper[4839]: I0321 04:46:34.154234 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 21 04:46:34 crc kubenswrapper[4839]: I0321 04:46:34.291606 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1626316f-b029-4424-b783-25eeb2790eb2","Type":"ContainerStarted","Data":"ba925efacaf027591d0a4dc124094b8564705b768ee6312022862c157381c653"} Mar 21 04:46:34 crc kubenswrapper[4839]: I0321 04:46:34.462938 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76b8f1b8-aa66-4f5e-937a-f837a2da28f1" path="/var/lib/kubelet/pods/76b8f1b8-aa66-4f5e-937a-f837a2da28f1/volumes" Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.051248 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.052098 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-central-agent" containerID="cri-o://cbba2b10323381d9b303a23b6607bd17c5906d7437bb21c852b760d41642da03" gracePeriod=30 Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.052313 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="proxy-httpd" containerID="cri-o://1ebe513656b6f58bbf6f0d69227894541ab6a4fa4cbe47f5b1af5f7551f5352e" gracePeriod=30 Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.052369 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-notification-agent" containerID="cri-o://1c072f26f9d106f6161eb66b3f6c9a76cc9db2ce4ede2775c0501c104b78c25c" gracePeriod=30 Mar 21 04:46:35 
crc kubenswrapper[4839]: I0321 04:46:35.052317 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="sg-core" containerID="cri-o://1e51df175e72743e7d699f4e5fcec298f453f2659fd1cb0a4c210eba9115f1a3" gracePeriod=30 Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.303627 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1626316f-b029-4424-b783-25eeb2790eb2","Type":"ContainerStarted","Data":"ac03578aadfbad30195691a7bbe3beec9a22dc9381538c0c5f6b5f64c58f29b2"} Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.303786 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.307369 4839 generic.go:334] "Generic (PLEG): container finished" podID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerID="1ebe513656b6f58bbf6f0d69227894541ab6a4fa4cbe47f5b1af5f7551f5352e" exitCode=0 Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.307436 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerDied","Data":"1ebe513656b6f58bbf6f0d69227894541ab6a4fa4cbe47f5b1af5f7551f5352e"} Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.307476 4839 generic.go:334] "Generic (PLEG): container finished" podID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerID="1e51df175e72743e7d699f4e5fcec298f453f2659fd1cb0a4c210eba9115f1a3" exitCode=2 Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.307499 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerDied","Data":"1e51df175e72743e7d699f4e5fcec298f453f2659fd1cb0a4c210eba9115f1a3"} Mar 21 04:46:35 crc kubenswrapper[4839]: I0321 04:46:35.338091 4839 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.961919914 podStartE2EDuration="2.338074961s" podCreationTimestamp="2026-03-21 04:46:33 +0000 UTC" firstStartedPulling="2026-03-21 04:46:34.157732951 +0000 UTC m=+1398.485519627" lastFinishedPulling="2026-03-21 04:46:34.533887998 +0000 UTC m=+1398.861674674" observedRunningTime="2026-03-21 04:46:35.334615714 +0000 UTC m=+1399.662402390" watchObservedRunningTime="2026-03-21 04:46:35.338074961 +0000 UTC m=+1399.665861637" Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.320072 4839 generic.go:334] "Generic (PLEG): container finished" podID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerID="cbba2b10323381d9b303a23b6607bd17c5906d7437bb21c852b760d41642da03" exitCode=0 Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.320156 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerDied","Data":"cbba2b10323381d9b303a23b6607bd17c5906d7437bb21c852b760d41642da03"} Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.325005 4839 generic.go:334] "Generic (PLEG): container finished" podID="37c6fbf7-427d-45a8-b190-439265c8d6d0" containerID="6500e5c41c0724032a37daabaaadca5a2ab96ab0732aaceeaaccdf5e739d902c" exitCode=0 Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.325110 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-csj7l" event={"ID":"37c6fbf7-427d-45a8-b190-439265c8d6d0","Type":"ContainerDied","Data":"6500e5c41c0724032a37daabaaadca5a2ab96ab0732aaceeaaccdf5e739d902c"} Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.517815 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.518122 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-scheduler-0" Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.540422 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.540466 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.554562 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.828533 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.852444 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:46:36 crc kubenswrapper[4839]: I0321 04:46:36.939402 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-qc28r"] Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.335085 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" podUID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerName="dnsmasq-dns" containerID="cri-o://c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18" gracePeriod=10 Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.370134 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.624803 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 
04:46:37.624859 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.868224 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.968079 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wj7k\" (UniqueName: \"kubernetes.io/projected/37c6fbf7-427d-45a8-b190-439265c8d6d0-kube-api-access-5wj7k\") pod \"37c6fbf7-427d-45a8-b190-439265c8d6d0\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.968243 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-scripts\") pod \"37c6fbf7-427d-45a8-b190-439265c8d6d0\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.968273 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-config-data\") pod \"37c6fbf7-427d-45a8-b190-439265c8d6d0\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.968345 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-combined-ca-bundle\") pod \"37c6fbf7-427d-45a8-b190-439265c8d6d0\" (UID: \"37c6fbf7-427d-45a8-b190-439265c8d6d0\") " Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.998387 4839 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-scripts" (OuterVolumeSpecName: "scripts") pod "37c6fbf7-427d-45a8-b190-439265c8d6d0" (UID: "37c6fbf7-427d-45a8-b190-439265c8d6d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:37 crc kubenswrapper[4839]: I0321 04:46:37.999514 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37c6fbf7-427d-45a8-b190-439265c8d6d0-kube-api-access-5wj7k" (OuterVolumeSpecName: "kube-api-access-5wj7k") pod "37c6fbf7-427d-45a8-b190-439265c8d6d0" (UID: "37c6fbf7-427d-45a8-b190-439265c8d6d0"). InnerVolumeSpecName "kube-api-access-5wj7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.040773 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37c6fbf7-427d-45a8-b190-439265c8d6d0" (UID: "37c6fbf7-427d-45a8-b190-439265c8d6d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.052093 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-config-data" (OuterVolumeSpecName: "config-data") pod "37c6fbf7-427d-45a8-b190-439265c8d6d0" (UID: "37c6fbf7-427d-45a8-b190-439265c8d6d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.070213 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wj7k\" (UniqueName: \"kubernetes.io/projected/37c6fbf7-427d-45a8-b190-439265c8d6d0-kube-api-access-5wj7k\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.070256 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.070269 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.070280 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c6fbf7-427d-45a8-b190-439265c8d6d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.115526 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.286733 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-sb\") pod \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.286794 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-nb\") pod \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.286822 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-svc\") pod \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.286852 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wj7c\" (UniqueName: \"kubernetes.io/projected/439bd408-2f5c-45cc-a2f7-8166a4a279c2-kube-api-access-4wj7c\") pod \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.286906 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-config\") pod \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.286937 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-swift-storage-0\") pod \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\" (UID: \"439bd408-2f5c-45cc-a2f7-8166a4a279c2\") " Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.298073 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/439bd408-2f5c-45cc-a2f7-8166a4a279c2-kube-api-access-4wj7c" (OuterVolumeSpecName: "kube-api-access-4wj7c") pod "439bd408-2f5c-45cc-a2f7-8166a4a279c2" (UID: "439bd408-2f5c-45cc-a2f7-8166a4a279c2"). InnerVolumeSpecName "kube-api-access-4wj7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.346551 4839 generic.go:334] "Generic (PLEG): container finished" podID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerID="c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18" exitCode=0 Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.346633 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" event={"ID":"439bd408-2f5c-45cc-a2f7-8166a4a279c2","Type":"ContainerDied","Data":"c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18"} Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.346686 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" event={"ID":"439bd408-2f5c-45cc-a2f7-8166a4a279c2","Type":"ContainerDied","Data":"21228341591d8e5aec6ec7937412b30e803ee3d8f05a2c9720bd304ab86d36ca"} Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.346687 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-qc28r" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.346706 4839 scope.go:117] "RemoveContainer" containerID="c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.349274 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-csj7l" event={"ID":"37c6fbf7-427d-45a8-b190-439265c8d6d0","Type":"ContainerDied","Data":"deb2bfcfde4895ebb1cae51b1ec4d964acc93bf15dd9d8c3a0d4a4811a853624"} Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.349352 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deb2bfcfde4895ebb1cae51b1ec4d964acc93bf15dd9d8c3a0d4a4811a853624" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.349463 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-csj7l" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.350220 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "439bd408-2f5c-45cc-a2f7-8166a4a279c2" (UID: "439bd408-2f5c-45cc-a2f7-8166a4a279c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.354421 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-config" (OuterVolumeSpecName: "config") pod "439bd408-2f5c-45cc-a2f7-8166a4a279c2" (UID: "439bd408-2f5c-45cc-a2f7-8166a4a279c2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.358911 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "439bd408-2f5c-45cc-a2f7-8166a4a279c2" (UID: "439bd408-2f5c-45cc-a2f7-8166a4a279c2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.368824 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "439bd408-2f5c-45cc-a2f7-8166a4a279c2" (UID: "439bd408-2f5c-45cc-a2f7-8166a4a279c2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.374166 4839 scope.go:117] "RemoveContainer" containerID="583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.382487 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "439bd408-2f5c-45cc-a2f7-8166a4a279c2" (UID: "439bd408-2f5c-45cc-a2f7-8166a4a279c2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.389525 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.389558 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.389580 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.389589 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.389598 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/439bd408-2f5c-45cc-a2f7-8166a4a279c2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.389606 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wj7c\" (UniqueName: \"kubernetes.io/projected/439bd408-2f5c-45cc-a2f7-8166a4a279c2-kube-api-access-4wj7c\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.402516 4839 scope.go:117] "RemoveContainer" containerID="c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18" Mar 21 04:46:38 crc kubenswrapper[4839]: E0321 04:46:38.405220 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18\": container with ID starting with c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18 not found: ID does not exist" containerID="c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.405270 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18"} err="failed to get container status \"c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18\": rpc error: code = NotFound desc = could not find container \"c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18\": container with ID starting with c21a3fe620f5493aafd1cd6d7aa74d463dea1dbb4165e2e674ec4728c83d9d18 not found: ID does not exist" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.405299 4839 scope.go:117] "RemoveContainer" containerID="583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b" Mar 21 04:46:38 crc kubenswrapper[4839]: E0321 04:46:38.405591 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b\": container with ID starting with 583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b not found: ID does not exist" containerID="583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.405620 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b"} err="failed to get container status \"583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b\": rpc error: code = NotFound desc = could not find container 
\"583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b\": container with ID starting with 583f06ca5d49ecf29149b9c8c616f5732714a4739948ee020d19d827d943128b not found: ID does not exist" Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.464241 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.464444 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-log" containerID="cri-o://226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0" gracePeriod=30 Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.465135 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-api" containerID="cri-o://7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c" gracePeriod=30 Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.670895 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-qc28r"] Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.677957 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-qc28r"] Mar 21 04:46:38 crc kubenswrapper[4839]: I0321 04:46:38.891442 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.361632 4839 generic.go:334] "Generic (PLEG): container finished" podID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerID="1c072f26f9d106f6161eb66b3f6c9a76cc9db2ce4ede2775c0501c104b78c25c" exitCode=0 Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.361699 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerDied","Data":"1c072f26f9d106f6161eb66b3f6c9a76cc9db2ce4ede2775c0501c104b78c25c"} Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.366028 4839 generic.go:334] "Generic (PLEG): container finished" podID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerID="226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0" exitCode=143 Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.366272 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60e89b8f-2e6a-43a6-a9de-162a457cc5fb","Type":"ContainerDied","Data":"226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0"} Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.886990 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.947500 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-scripts\") pod \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.947549 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-run-httpd\") pod \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.947586 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-combined-ca-bundle\") pod \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.947662 4839 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp7fg\" (UniqueName: \"kubernetes.io/projected/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-kube-api-access-cp7fg\") pod \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.947684 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-sg-core-conf-yaml\") pod \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.947767 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-log-httpd\") pod \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.947789 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-config-data\") pod \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\" (UID: \"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6\") " Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.948712 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" (UID: "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.949008 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" (UID: "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.955247 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-kube-api-access-cp7fg" (OuterVolumeSpecName: "kube-api-access-cp7fg") pod "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" (UID: "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6"). InnerVolumeSpecName "kube-api-access-cp7fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.958725 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-scripts" (OuterVolumeSpecName: "scripts") pod "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" (UID: "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:39 crc kubenswrapper[4839]: I0321 04:46:39.997782 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" (UID: "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.050194 4839 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.050238 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.050249 4839 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.050262 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp7fg\" (UniqueName: \"kubernetes.io/projected/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-kube-api-access-cp7fg\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.050275 4839 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.052500 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" (UID: "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.079106 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-config-data" (OuterVolumeSpecName: "config-data") pod "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" (UID: "8d47edf7-95e3-4bb5-ab87-27c9db5d05d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.151709 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.151758 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.401039 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8d47edf7-95e3-4bb5-ab87-27c9db5d05d6","Type":"ContainerDied","Data":"d75d884bbe4a31f233cb211086c9cf5693ae2ba4faf11ef4b832e01a60ea7483"} Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.401089 4839 scope.go:117] "RemoveContainer" containerID="1ebe513656b6f58bbf6f0d69227894541ab6a4fa4cbe47f5b1af5f7551f5352e" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.401207 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.408880 4839 generic.go:334] "Generic (PLEG): container finished" podID="500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" containerID="118f2c293ce181a9defa7eb0621b40d7a4ec32e8ea91c36b0f98ccebfdd6ba13" exitCode=0 Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.409152 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jznl6" event={"ID":"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d","Type":"ContainerDied","Data":"118f2c293ce181a9defa7eb0621b40d7a4ec32e8ea91c36b0f98ccebfdd6ba13"} Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.409222 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="62694a5a-1565-4831-bff3-504a782692bb" containerName="nova-scheduler-scheduler" containerID="cri-o://8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32" gracePeriod=30 Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.424468 4839 scope.go:117] "RemoveContainer" containerID="1e51df175e72743e7d699f4e5fcec298f453f2659fd1cb0a4c210eba9115f1a3" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.468843 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" path="/var/lib/kubelet/pods/439bd408-2f5c-45cc-a2f7-8166a4a279c2/volumes" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.469667 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.469699 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.471008 4839 scope.go:117] "RemoveContainer" containerID="1c072f26f9d106f6161eb66b3f6c9a76cc9db2ce4ede2775c0501c104b78c25c" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.488682 4839 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Mar 21 04:46:40 crc kubenswrapper[4839]: E0321 04:46:40.489130 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerName="init" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489149 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerName="init" Mar 21 04:46:40 crc kubenswrapper[4839]: E0321 04:46:40.489171 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-notification-agent" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489178 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-notification-agent" Mar 21 04:46:40 crc kubenswrapper[4839]: E0321 04:46:40.489189 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-central-agent" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489196 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-central-agent" Mar 21 04:46:40 crc kubenswrapper[4839]: E0321 04:46:40.489210 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerName="dnsmasq-dns" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489216 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerName="dnsmasq-dns" Mar 21 04:46:40 crc kubenswrapper[4839]: E0321 04:46:40.489226 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c6fbf7-427d-45a8-b190-439265c8d6d0" containerName="nova-manage" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489233 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="37c6fbf7-427d-45a8-b190-439265c8d6d0" containerName="nova-manage" Mar 21 04:46:40 crc kubenswrapper[4839]: E0321 04:46:40.489244 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="sg-core" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489249 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="sg-core" Mar 21 04:46:40 crc kubenswrapper[4839]: E0321 04:46:40.489265 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="proxy-httpd" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489270 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="proxy-httpd" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489470 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-notification-agent" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489513 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="ceilometer-central-agent" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489528 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="sg-core" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489540 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="439bd408-2f5c-45cc-a2f7-8166a4a279c2" containerName="dnsmasq-dns" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489549 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" containerName="proxy-httpd" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.489561 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="37c6fbf7-427d-45a8-b190-439265c8d6d0" containerName="nova-manage" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.491330 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.497299 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.497746 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.510493 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.517382 4839 scope.go:117] "RemoveContainer" containerID="cbba2b10323381d9b303a23b6607bd17c5906d7437bb21c852b760d41642da03" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.520016 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660283 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-log-httpd\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660366 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtppj\" (UniqueName: \"kubernetes.io/projected/ab85f3f4-7277-419b-96bc-4f56d5891b16-kube-api-access-mtppj\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660386 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-config-data\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660411 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-scripts\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660475 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660596 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-run-httpd\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660687 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.660819 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.762822 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.762895 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-log-httpd\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.762933 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtppj\" (UniqueName: \"kubernetes.io/projected/ab85f3f4-7277-419b-96bc-4f56d5891b16-kube-api-access-mtppj\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.762949 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-config-data\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.762972 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-scripts\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.763008 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.763035 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-run-httpd\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.763068 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.763677 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-log-httpd\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.763957 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-run-httpd\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.768797 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: 
I0321 04:46:40.768351 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.775385 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-scripts\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.776307 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.777399 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-config-data\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.785662 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtppj\" (UniqueName: \"kubernetes.io/projected/ab85f3f4-7277-419b-96bc-4f56d5891b16-kube-api-access-mtppj\") pod \"ceilometer-0\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " pod="openstack/ceilometer-0" Mar 21 04:46:40 crc kubenswrapper[4839]: I0321 04:46:40.812777 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.254392 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.421032 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerStarted","Data":"a7cee1eec896bbb6d355b98092ee2f6320ccf9ad32ba43390936af295231ddd4"} Mar 21 04:46:41 crc kubenswrapper[4839]: E0321 04:46:41.537073 4839 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 04:46:41 crc kubenswrapper[4839]: E0321 04:46:41.543400 4839 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 04:46:41 crc kubenswrapper[4839]: E0321 04:46:41.573040 4839 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 04:46:41 crc kubenswrapper[4839]: E0321 04:46:41.573125 4839 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="62694a5a-1565-4831-bff3-504a782692bb" containerName="nova-scheduler-scheduler" Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.860260 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.984872 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-combined-ca-bundle\") pod \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.984921 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-config-data\") pod \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.985024 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rrz9\" (UniqueName: \"kubernetes.io/projected/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-kube-api-access-6rrz9\") pod \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.985103 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-scripts\") pod \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\" (UID: \"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d\") " Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.990578 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-kube-api-access-6rrz9" (OuterVolumeSpecName: "kube-api-access-6rrz9") pod 
"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" (UID: "500decd4-2b92-4e52-bfa8-bb8d1fe13b9d"). InnerVolumeSpecName "kube-api-access-6rrz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:46:41 crc kubenswrapper[4839]: I0321 04:46:41.990555 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-scripts" (OuterVolumeSpecName: "scripts") pod "500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" (UID: "500decd4-2b92-4e52-bfa8-bb8d1fe13b9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.014156 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" (UID: "500decd4-2b92-4e52-bfa8-bb8d1fe13b9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.018684 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-config-data" (OuterVolumeSpecName: "config-data") pod "500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" (UID: "500decd4-2b92-4e52-bfa8-bb8d1fe13b9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.090546 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.090614 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.090629 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.090642 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rrz9\" (UniqueName: \"kubernetes.io/projected/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d-kube-api-access-6rrz9\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.435117 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jznl6" event={"ID":"500decd4-2b92-4e52-bfa8-bb8d1fe13b9d","Type":"ContainerDied","Data":"adab925016451244fae9f2cf83f23ed7b20a7f3728fde316dcb382033aa897aa"} Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.437021 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adab925016451244fae9f2cf83f23ed7b20a7f3728fde316dcb382033aa897aa" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.437126 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerStarted","Data":"c9624ee09cc62f49f1e2db6cc40410325ea975a44c5ea0f1eaa54772bc90e8de"} Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 
04:46:42.435309 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jznl6" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.468561 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d47edf7-95e3-4bb5-ab87-27c9db5d05d6" path="/var/lib/kubelet/pods/8d47edf7-95e3-4bb5-ab87-27c9db5d05d6/volumes" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.503722 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 21 04:46:42 crc kubenswrapper[4839]: E0321 04:46:42.504344 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" containerName="nova-cell1-conductor-db-sync" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.504369 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" containerName="nova-cell1-conductor-db-sync" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.504730 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" containerName="nova-cell1-conductor-db-sync" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.505545 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.508745 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.527069 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.701138 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3194b187-fe06-4eed-b725-995cef2b05a0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.701219 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smpbc\" (UniqueName: \"kubernetes.io/projected/3194b187-fe06-4eed-b725-995cef2b05a0-kube-api-access-smpbc\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.701286 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3194b187-fe06-4eed-b725-995cef2b05a0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.803115 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3194b187-fe06-4eed-b725-995cef2b05a0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc 
kubenswrapper[4839]: I0321 04:46:42.803244 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3194b187-fe06-4eed-b725-995cef2b05a0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.803287 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smpbc\" (UniqueName: \"kubernetes.io/projected/3194b187-fe06-4eed-b725-995cef2b05a0-kube-api-access-smpbc\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.808771 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3194b187-fe06-4eed-b725-995cef2b05a0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.812633 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3194b187-fe06-4eed-b725-995cef2b05a0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:42 crc kubenswrapper[4839]: I0321 04:46:42.840946 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smpbc\" (UniqueName: \"kubernetes.io/projected/3194b187-fe06-4eed-b725-995cef2b05a0-kube-api-access-smpbc\") pod \"nova-cell1-conductor-0\" (UID: \"3194b187-fe06-4eed-b725-995cef2b05a0\") " pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.132561 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.266221 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.413451 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zk4h\" (UniqueName: \"kubernetes.io/projected/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-kube-api-access-4zk4h\") pod \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.413826 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-logs\") pod \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.413973 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-combined-ca-bundle\") pod \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.414009 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-config-data\") pod \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\" (UID: \"60e89b8f-2e6a-43a6-a9de-162a457cc5fb\") " Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.414427 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-logs" (OuterVolumeSpecName: "logs") pod "60e89b8f-2e6a-43a6-a9de-162a457cc5fb" (UID: "60e89b8f-2e6a-43a6-a9de-162a457cc5fb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.414852 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.423160 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-kube-api-access-4zk4h" (OuterVolumeSpecName: "kube-api-access-4zk4h") pod "60e89b8f-2e6a-43a6-a9de-162a457cc5fb" (UID: "60e89b8f-2e6a-43a6-a9de-162a457cc5fb"). InnerVolumeSpecName "kube-api-access-4zk4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.438680 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-config-data" (OuterVolumeSpecName: "config-data") pod "60e89b8f-2e6a-43a6-a9de-162a457cc5fb" (UID: "60e89b8f-2e6a-43a6-a9de-162a457cc5fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.450900 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60e89b8f-2e6a-43a6-a9de-162a457cc5fb" (UID: "60e89b8f-2e6a-43a6-a9de-162a457cc5fb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.459931 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerStarted","Data":"b489bca73bc4e58653d79d57303a9cbf2e7e55c12b829b4e8ad5f82f29e57974"} Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.459990 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerStarted","Data":"81cce601ac7ae485ab54fe031a2c9780e7939ffd4d001fc1df6fa33f481f6387"} Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.461739 4839 generic.go:334] "Generic (PLEG): container finished" podID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerID="7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c" exitCode=0 Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.461762 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60e89b8f-2e6a-43a6-a9de-162a457cc5fb","Type":"ContainerDied","Data":"7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c"} Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.461777 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60e89b8f-2e6a-43a6-a9de-162a457cc5fb","Type":"ContainerDied","Data":"578df92eb43a9d3deb754fc3112c79ab2340cf1a5936b7d1362d0e02e009882d"} Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.461792 4839 scope.go:117] "RemoveContainer" containerID="7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.461893 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.511475 4839 scope.go:117] "RemoveContainer" containerID="226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.514791 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.516121 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zk4h\" (UniqueName: \"kubernetes.io/projected/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-kube-api-access-4zk4h\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.516150 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.516163 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e89b8f-2e6a-43a6-a9de-162a457cc5fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.527881 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.547847 4839 scope.go:117] "RemoveContainer" containerID="7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c" Mar 21 04:46:43 crc kubenswrapper[4839]: E0321 04:46:43.548356 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c\": container with ID starting with 7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c not found: ID does not exist" containerID="7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c" Mar 21 04:46:43 
crc kubenswrapper[4839]: I0321 04:46:43.548395 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c"} err="failed to get container status \"7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c\": rpc error: code = NotFound desc = could not find container \"7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c\": container with ID starting with 7cdfb7b00a9da31b06629cf79954f57907a802a192541e04d064b035f961823c not found: ID does not exist" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.548421 4839 scope.go:117] "RemoveContainer" containerID="226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0" Mar 21 04:46:43 crc kubenswrapper[4839]: E0321 04:46:43.551161 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0\": container with ID starting with 226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0 not found: ID does not exist" containerID="226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.551195 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0"} err="failed to get container status \"226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0\": rpc error: code = NotFound desc = could not find container \"226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0\": container with ID starting with 226ae5bc81ef92a5b9fb858af04a67ca5304b066b92652d3561cb37b483b0ed0 not found: ID does not exist" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.554394 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 21 04:46:43 crc kubenswrapper[4839]: 
E0321 04:46:43.554856 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-api" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.554881 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-api" Mar 21 04:46:43 crc kubenswrapper[4839]: E0321 04:46:43.554920 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-log" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.554929 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-log" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.555158 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-api" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.555176 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" containerName="nova-api-log" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.556370 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.558655 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.564518 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.658443 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.685187 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.720001 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbjsj\" (UniqueName: \"kubernetes.io/projected/233bba1a-658e-4073-acb5-c80398a849f1-kube-api-access-cbjsj\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.720100 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-config-data\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.720134 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.720224 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/233bba1a-658e-4073-acb5-c80398a849f1-logs\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.822855 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233bba1a-658e-4073-acb5-c80398a849f1-logs\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.823366 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbjsj\" (UniqueName: \"kubernetes.io/projected/233bba1a-658e-4073-acb5-c80398a849f1-kube-api-access-cbjsj\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.823410 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-config-data\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.823437 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.823455 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233bba1a-658e-4073-acb5-c80398a849f1-logs\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.829948 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.831090 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-config-data\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.851458 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbjsj\" (UniqueName: \"kubernetes.io/projected/233bba1a-658e-4073-acb5-c80398a849f1-kube-api-access-cbjsj\") pod \"nova-api-0\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " pod="openstack/nova-api-0" Mar 21 04:46:43 crc kubenswrapper[4839]: I0321 04:46:43.873580 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.344386 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.487491 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60e89b8f-2e6a-43a6-a9de-162a457cc5fb" path="/var/lib/kubelet/pods/60e89b8f-2e6a-43a6-a9de-162a457cc5fb/volumes" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.496083 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.509863 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3194b187-fe06-4eed-b725-995cef2b05a0","Type":"ContainerStarted","Data":"f59167442230fe9f9fb6760493e49d3f6b6cafa239d5654d1dabd78b392a18e0"} Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.509907 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3194b187-fe06-4eed-b725-995cef2b05a0","Type":"ContainerStarted","Data":"de23fe2c5f0df093ecc4c0d5c58b5ef3097e1868de09c451d0bb724cc7a1addd"} Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.509937 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.518126 4839 generic.go:334] "Generic (PLEG): container finished" podID="62694a5a-1565-4831-bff3-504a782692bb" containerID="8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32" exitCode=0 Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.518308 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.518381 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"62694a5a-1565-4831-bff3-504a782692bb","Type":"ContainerDied","Data":"8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32"} Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.519111 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"62694a5a-1565-4831-bff3-504a782692bb","Type":"ContainerDied","Data":"0b486a0ca8e515cbd2ddc4f12af8c937feaf1c88976b8dbba2cd271361b2775c"} Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.519142 4839 scope.go:117] "RemoveContainer" containerID="8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.536273 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-config-data\") pod \"62694a5a-1565-4831-bff3-504a782692bb\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.536336 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ww4d\" (UniqueName: \"kubernetes.io/projected/62694a5a-1565-4831-bff3-504a782692bb-kube-api-access-6ww4d\") pod \"62694a5a-1565-4831-bff3-504a782692bb\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.536452 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-combined-ca-bundle\") pod \"62694a5a-1565-4831-bff3-504a782692bb\" (UID: \"62694a5a-1565-4831-bff3-504a782692bb\") " Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.542109 4839 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.542091353 podStartE2EDuration="2.542091353s" podCreationTimestamp="2026-03-21 04:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:46:44.529451197 +0000 UTC m=+1408.857237873" watchObservedRunningTime="2026-03-21 04:46:44.542091353 +0000 UTC m=+1408.869878029" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.548040 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62694a5a-1565-4831-bff3-504a782692bb-kube-api-access-6ww4d" (OuterVolumeSpecName: "kube-api-access-6ww4d") pod "62694a5a-1565-4831-bff3-504a782692bb" (UID: "62694a5a-1565-4831-bff3-504a782692bb"). InnerVolumeSpecName "kube-api-access-6ww4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.555088 4839 scope.go:117] "RemoveContainer" containerID="8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32" Mar 21 04:46:44 crc kubenswrapper[4839]: E0321 04:46:44.559739 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32\": container with ID starting with 8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32 not found: ID does not exist" containerID="8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.559785 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32"} err="failed to get container status \"8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32\": rpc error: code = NotFound desc = could not find container 
\"8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32\": container with ID starting with 8395b25a3a6856e54e89996dc9dd678ad43d748174fdcbe34c43485b6c527d32 not found: ID does not exist" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.569612 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-config-data" (OuterVolumeSpecName: "config-data") pod "62694a5a-1565-4831-bff3-504a782692bb" (UID: "62694a5a-1565-4831-bff3-504a782692bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.570863 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62694a5a-1565-4831-bff3-504a782692bb" (UID: "62694a5a-1565-4831-bff3-504a782692bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.639034 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.639068 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ww4d\" (UniqueName: \"kubernetes.io/projected/62694a5a-1565-4831-bff3-504a782692bb-kube-api-access-6ww4d\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.639080 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62694a5a-1565-4831-bff3-504a782692bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.696948 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 21 04:46:44 crc kubenswrapper[4839]: I0321 04:46:44.696980 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.005442 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.019022 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.029787 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:46:45 crc kubenswrapper[4839]: E0321 04:46:45.030480 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62694a5a-1565-4831-bff3-504a782692bb" containerName="nova-scheduler-scheduler" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.030498 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="62694a5a-1565-4831-bff3-504a782692bb" containerName="nova-scheduler-scheduler" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.030709 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="62694a5a-1565-4831-bff3-504a782692bb" containerName="nova-scheduler-scheduler" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.031357 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.047755 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.069119 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.148334 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4szpp\" (UniqueName: \"kubernetes.io/projected/1ee2fcd4-456d-436a-ae9e-95f8224e2834-kube-api-access-4szpp\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.148397 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.148439 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-config-data\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:45 crc kubenswrapper[4839]: 
I0321 04:46:45.249961 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4szpp\" (UniqueName: \"kubernetes.io/projected/1ee2fcd4-456d-436a-ae9e-95f8224e2834-kube-api-access-4szpp\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.250036 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.250074 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-config-data\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.254142 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-config-data\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.255263 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.309257 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4szpp\" (UniqueName: 
\"kubernetes.io/projected/1ee2fcd4-456d-436a-ae9e-95f8224e2834-kube-api-access-4szpp\") pod \"nova-scheduler-0\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " pod="openstack/nova-scheduler-0" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.348515 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.530616 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233bba1a-658e-4073-acb5-c80398a849f1","Type":"ContainerStarted","Data":"deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f"} Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.530973 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233bba1a-658e-4073-acb5-c80398a849f1","Type":"ContainerStarted","Data":"31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92"} Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.530995 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233bba1a-658e-4073-acb5-c80398a849f1","Type":"ContainerStarted","Data":"b12b8ea76c192eae2fc0d2d772e7797d514c5f048bf2be61c0b8a59a9057d453"} Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.551273 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerStarted","Data":"9c55f46092a6529b94b2130cf1674611a911d2c1c073359151d36cbadd55f2db"} Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.551344 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.555777 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.555735862 podStartE2EDuration="2.555735862s" podCreationTimestamp="2026-03-21 04:46:43 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:46:45.549893527 +0000 UTC m=+1409.877680203" watchObservedRunningTime="2026-03-21 04:46:45.555735862 +0000 UTC m=+1409.883522568" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.589111 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.022949639 podStartE2EDuration="5.58908796s" podCreationTimestamp="2026-03-21 04:46:40 +0000 UTC" firstStartedPulling="2026-03-21 04:46:41.26295627 +0000 UTC m=+1405.590742946" lastFinishedPulling="2026-03-21 04:46:44.829094591 +0000 UTC m=+1409.156881267" observedRunningTime="2026-03-21 04:46:45.579756758 +0000 UTC m=+1409.907543454" watchObservedRunningTime="2026-03-21 04:46:45.58908796 +0000 UTC m=+1409.916874636" Mar 21 04:46:45 crc kubenswrapper[4839]: I0321 04:46:45.850505 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:46:45 crc kubenswrapper[4839]: W0321 04:46:45.864793 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ee2fcd4_456d_436a_ae9e_95f8224e2834.slice/crio-2dc68d964bf781e74e92f46f7b166439f6d66a340baaedb7718535c26ac20b36 WatchSource:0}: Error finding container 2dc68d964bf781e74e92f46f7b166439f6d66a340baaedb7718535c26ac20b36: Status 404 returned error can't find the container with id 2dc68d964bf781e74e92f46f7b166439f6d66a340baaedb7718535c26ac20b36 Mar 21 04:46:46 crc kubenswrapper[4839]: I0321 04:46:46.469756 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62694a5a-1565-4831-bff3-504a782692bb" path="/var/lib/kubelet/pods/62694a5a-1565-4831-bff3-504a782692bb/volumes" Mar 21 04:46:46 crc kubenswrapper[4839]: I0321 04:46:46.561107 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"1ee2fcd4-456d-436a-ae9e-95f8224e2834","Type":"ContainerStarted","Data":"09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978"} Mar 21 04:46:46 crc kubenswrapper[4839]: I0321 04:46:46.561155 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1ee2fcd4-456d-436a-ae9e-95f8224e2834","Type":"ContainerStarted","Data":"2dc68d964bf781e74e92f46f7b166439f6d66a340baaedb7718535c26ac20b36"} Mar 21 04:46:46 crc kubenswrapper[4839]: I0321 04:46:46.601256 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.6012325280000002 podStartE2EDuration="1.601232528s" podCreationTimestamp="2026-03-21 04:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:46:46.593761558 +0000 UTC m=+1410.921548234" watchObservedRunningTime="2026-03-21 04:46:46.601232528 +0000 UTC m=+1410.929019204" Mar 21 04:46:47 crc kubenswrapper[4839]: I0321 04:46:47.784666 4839 scope.go:117] "RemoveContainer" containerID="54072f0390a561fb948d238ef6ee4fb04223cd43a9ba8e8eef297b621c8367df" Mar 21 04:46:48 crc kubenswrapper[4839]: I0321 04:46:48.159480 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 21 04:46:50 crc kubenswrapper[4839]: I0321 04:46:50.349306 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 21 04:46:53 crc kubenswrapper[4839]: I0321 04:46:53.875249 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 04:46:53 crc kubenswrapper[4839]: I0321 04:46:53.875868 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 04:46:55 crc kubenswrapper[4839]: I0321 04:46:55.073990 4839 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 04:46:55 crc kubenswrapper[4839]: I0321 04:46:55.116777 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 04:46:55 crc kubenswrapper[4839]: I0321 04:46:55.351141 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 21 04:46:55 crc kubenswrapper[4839]: I0321 04:46:55.377774 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 21 04:46:56 crc kubenswrapper[4839]: I0321 04:46:56.085661 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 21 04:47:01 crc kubenswrapper[4839]: I0321 04:47:01.875258 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 21 04:47:01 crc kubenswrapper[4839]: I0321 04:47:01.875832 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.800843 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.806459 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.861349 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dwgm\" (UniqueName: \"kubernetes.io/projected/205b5c5e-c09f-4b4a-8a56-f98531ad0125-kube-api-access-2dwgm\") pod \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.861486 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205b5c5e-c09f-4b4a-8a56-f98531ad0125-logs\") pod \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.861520 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-config-data\") pod \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.861711 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-combined-ca-bundle\") pod \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.861757 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-config-data\") pod \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.861773 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xbgl\" (UniqueName: 
\"kubernetes.io/projected/3307932f-5c67-4abb-9649-e4b3a0a19e9c-kube-api-access-5xbgl\") pod \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\" (UID: \"3307932f-5c67-4abb-9649-e4b3a0a19e9c\") " Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.861863 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-combined-ca-bundle\") pod \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\" (UID: \"205b5c5e-c09f-4b4a-8a56-f98531ad0125\") " Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.862390 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/205b5c5e-c09f-4b4a-8a56-f98531ad0125-logs" (OuterVolumeSpecName: "logs") pod "205b5c5e-c09f-4b4a-8a56-f98531ad0125" (UID: "205b5c5e-c09f-4b4a-8a56-f98531ad0125"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.867274 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/205b5c5e-c09f-4b4a-8a56-f98531ad0125-kube-api-access-2dwgm" (OuterVolumeSpecName: "kube-api-access-2dwgm") pod "205b5c5e-c09f-4b4a-8a56-f98531ad0125" (UID: "205b5c5e-c09f-4b4a-8a56-f98531ad0125"). InnerVolumeSpecName "kube-api-access-2dwgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.867592 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3307932f-5c67-4abb-9649-e4b3a0a19e9c-kube-api-access-5xbgl" (OuterVolumeSpecName: "kube-api-access-5xbgl") pod "3307932f-5c67-4abb-9649-e4b3a0a19e9c" (UID: "3307932f-5c67-4abb-9649-e4b3a0a19e9c"). InnerVolumeSpecName "kube-api-access-5xbgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.889291 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "205b5c5e-c09f-4b4a-8a56-f98531ad0125" (UID: "205b5c5e-c09f-4b4a-8a56-f98531ad0125"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.890076 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-config-data" (OuterVolumeSpecName: "config-data") pod "205b5c5e-c09f-4b4a-8a56-f98531ad0125" (UID: "205b5c5e-c09f-4b4a-8a56-f98531ad0125"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.894374 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-config-data" (OuterVolumeSpecName: "config-data") pod "3307932f-5c67-4abb-9649-e4b3a0a19e9c" (UID: "3307932f-5c67-4abb-9649-e4b3a0a19e9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.901549 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3307932f-5c67-4abb-9649-e4b3a0a19e9c" (UID: "3307932f-5c67-4abb-9649-e4b3a0a19e9c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.963897 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205b5c5e-c09f-4b4a-8a56-f98531ad0125-logs\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.963951 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.963967 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.963982 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3307932f-5c67-4abb-9649-e4b3a0a19e9c-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.963994 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xbgl\" (UniqueName: \"kubernetes.io/projected/3307932f-5c67-4abb-9649-e4b3a0a19e9c-kube-api-access-5xbgl\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.964006 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205b5c5e-c09f-4b4a-8a56-f98531ad0125-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:02 crc kubenswrapper[4839]: I0321 04:47:02.964018 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dwgm\" (UniqueName: \"kubernetes.io/projected/205b5c5e-c09f-4b4a-8a56-f98531ad0125-kube-api-access-2dwgm\") on node \"crc\" DevicePath \"\""
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.158875 4839 generic.go:334] "Generic (PLEG): container finished" podID="3307932f-5c67-4abb-9649-e4b3a0a19e9c" containerID="09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae" exitCode=137
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.158974 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3307932f-5c67-4abb-9649-e4b3a0a19e9c","Type":"ContainerDied","Data":"09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae"}
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.158983 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.159001 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3307932f-5c67-4abb-9649-e4b3a0a19e9c","Type":"ContainerDied","Data":"51088d3062d2b2360e5c4a54fc629b5fbeeafa49a6e356a501876595e528c519"}
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.159019 4839 scope.go:117] "RemoveContainer" containerID="09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.160924 4839 generic.go:334] "Generic (PLEG): container finished" podID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerID="1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea" exitCode=137
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.161070 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"205b5c5e-c09f-4b4a-8a56-f98531ad0125","Type":"ContainerDied","Data":"1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea"}
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.161122 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"205b5c5e-c09f-4b4a-8a56-f98531ad0125","Type":"ContainerDied","Data":"5792eedab352d18af4b7af67287b836849e3b15e2d915a2161d15245e06868bd"}
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.161238 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.198725 4839 scope.go:117] "RemoveContainer" containerID="09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae"
Mar 21 04:47:03 crc kubenswrapper[4839]: E0321 04:47:03.199401 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae\": container with ID starting with 09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae not found: ID does not exist" containerID="09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.199448 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae"} err="failed to get container status \"09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae\": rpc error: code = NotFound desc = could not find container \"09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae\": container with ID starting with 09c6f9251092b481adefc14b90420fb6a766a56c4477a842505f3e261b3621ae not found: ID does not exist"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.199471 4839 scope.go:117] "RemoveContainer" containerID="1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.203739 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.225238 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.237793 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.241209 4839 scope.go:117] "RemoveContainer" containerID="3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.248936 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 04:47:03 crc kubenswrapper[4839]: E0321 04:47:03.249438 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-metadata"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.249461 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-metadata"
Mar 21 04:47:03 crc kubenswrapper[4839]: E0321 04:47:03.249493 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3307932f-5c67-4abb-9649-e4b3a0a19e9c" containerName="nova-cell1-novncproxy-novncproxy"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.249502 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3307932f-5c67-4abb-9649-e4b3a0a19e9c" containerName="nova-cell1-novncproxy-novncproxy"
Mar 21 04:47:03 crc kubenswrapper[4839]: E0321 04:47:03.249516 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-log"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.249524 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-log"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.249739 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3307932f-5c67-4abb-9649-e4b3a0a19e9c" containerName="nova-cell1-novncproxy-novncproxy"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.249752 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-metadata"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.249767 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" containerName="nova-metadata-log"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.250527 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.256412 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.256703 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.256723 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.259661 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.273074 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.284538 4839 scope.go:117] "RemoveContainer" containerID="1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea"
Mar 21 04:47:03 crc kubenswrapper[4839]: E0321 04:47:03.285041 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea\": container with ID starting with 1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea not found: ID does not exist" containerID="1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.285069 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea"} err="failed to get container status \"1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea\": rpc error: code = NotFound desc = could not find container \"1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea\": container with ID starting with 1410d57874493b657cf6a91a40179a2fad0e19711c82366afcf394ece3f3c6ea not found: ID does not exist"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.285090 4839 scope.go:117] "RemoveContainer" containerID="3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.285714 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.287742 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: E0321 04:47:03.290175 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c\": container with ID starting with 3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c not found: ID does not exist" containerID="3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.290262 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c"} err="failed to get container status \"3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c\": rpc error: code = NotFound desc = could not find container \"3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c\": container with ID starting with 3f29eb5cddc75c3558b62b85ba6c233f60869930c7d451ce2bed01578fe06e6c not found: ID does not exist"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.290560 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.290803 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.303755 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.370653 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcxww\" (UniqueName: \"kubernetes.io/projected/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-kube-api-access-tcxww\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.371347 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.371391 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.371475 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.371622 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475214 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475301 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475358 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475618 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475668 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcxww\" (UniqueName: \"kubernetes.io/projected/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-kube-api-access-tcxww\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475802 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b7fw\" (UniqueName: \"kubernetes.io/projected/8e28a9be-2244-43bb-9043-2ededa502897-kube-api-access-2b7fw\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475885 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-config-data\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475936 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.475971 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.476020 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e28a9be-2244-43bb-9043-2ededa502897-logs\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.488670 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.488684 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.488732 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.488748 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.493862 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcxww\" (UniqueName: \"kubernetes.io/projected/9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2-kube-api-access-tcxww\") pod \"nova-cell1-novncproxy-0\" (UID: \"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.569513 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.577356 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b7fw\" (UniqueName: \"kubernetes.io/projected/8e28a9be-2244-43bb-9043-2ededa502897-kube-api-access-2b7fw\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.577543 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-config-data\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.577739 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e28a9be-2244-43bb-9043-2ededa502897-logs\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.578074 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.578238 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.578394 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e28a9be-2244-43bb-9043-2ededa502897-logs\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.581175 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-config-data\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.581453 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.581964 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.593488 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b7fw\" (UniqueName: \"kubernetes.io/projected/8e28a9be-2244-43bb-9043-2ededa502897-kube-api-access-2b7fw\") pod \"nova-metadata-0\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.605012 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.883142 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.884497 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.887666 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 21 04:47:03 crc kubenswrapper[4839]: I0321 04:47:03.992288 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 21 04:47:03 crc kubenswrapper[4839]: W0321 04:47:03.993259 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ddf8fc2_ec2a_4b98_aa76_2dc43426e3f2.slice/crio-b3ba58611e9b49d710ea676883c59ee0efd6c5274af7bdb09c8a550e55c06599 WatchSource:0}: Error finding container b3ba58611e9b49d710ea676883c59ee0efd6c5274af7bdb09c8a550e55c06599: Status 404 returned error can't find the container with id b3ba58611e9b49d710ea676883c59ee0efd6c5274af7bdb09c8a550e55c06599
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.100149 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 21 04:47:04 crc kubenswrapper[4839]: W0321 04:47:04.101255 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e28a9be_2244_43bb_9043_2ededa502897.slice/crio-6d254a4eb3f7d7fdb366dc10b2c861d9dba719fa5c681ffb26fb0cc817d0f6f3 WatchSource:0}: Error finding container 6d254a4eb3f7d7fdb366dc10b2c861d9dba719fa5c681ffb26fb0cc817d0f6f3: Status 404 returned error can't find the container with id 6d254a4eb3f7d7fdb366dc10b2c861d9dba719fa5c681ffb26fb0cc817d0f6f3
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.175060 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2","Type":"ContainerStarted","Data":"b3ba58611e9b49d710ea676883c59ee0efd6c5274af7bdb09c8a550e55c06599"}
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.180689 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e28a9be-2244-43bb-9043-2ededa502897","Type":"ContainerStarted","Data":"6d254a4eb3f7d7fdb366dc10b2c861d9dba719fa5c681ffb26fb0cc817d0f6f3"}
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.188046 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.397462 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6862l"]
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.401889 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.418067 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6862l"]
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.473707 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="205b5c5e-c09f-4b4a-8a56-f98531ad0125" path="/var/lib/kubelet/pods/205b5c5e-c09f-4b4a-8a56-f98531ad0125/volumes"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.474480 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3307932f-5c67-4abb-9649-e4b3a0a19e9c" path="/var/lib/kubelet/pods/3307932f-5c67-4abb-9649-e4b3a0a19e9c/volumes"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.500023 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.500130 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-config\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.500159 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.500214 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.500275 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.500318 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfp48\" (UniqueName: \"kubernetes.io/projected/f0b06ab0-2209-4fb3-a837-ec755b412525-kube-api-access-pfp48\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.603447 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.603542 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-config\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.603591 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.603625 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.603672 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.603705 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfp48\" (UniqueName: \"kubernetes.io/projected/f0b06ab0-2209-4fb3-a837-ec755b412525-kube-api-access-pfp48\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.604585 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-config\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.604713 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.604857 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.604869 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.605340 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.629286 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfp48\" (UniqueName: \"kubernetes.io/projected/f0b06ab0-2209-4fb3-a837-ec755b412525-kube-api-access-pfp48\") pod \"dnsmasq-dns-cd5cbd7b9-6862l\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:04 crc kubenswrapper[4839]: I0321 04:47:04.851645 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l"
Mar 21 04:47:05 crc kubenswrapper[4839]: I0321 04:47:05.191712 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2","Type":"ContainerStarted","Data":"51609e39b21ec93da79fc362ba18f8cef3ba4ca1acff776bbc773773aba829bf"}
Mar 21 04:47:05 crc kubenswrapper[4839]: I0321 04:47:05.194931 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e28a9be-2244-43bb-9043-2ededa502897","Type":"ContainerStarted","Data":"a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5"}
Mar 21 04:47:05 crc kubenswrapper[4839]: I0321 04:47:05.194979 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e28a9be-2244-43bb-9043-2ededa502897","Type":"ContainerStarted","Data":"c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48"}
Mar 21 04:47:05 crc kubenswrapper[4839]: I0321 04:47:05.228793 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.228770189 podStartE2EDuration="2.228770189s" podCreationTimestamp="2026-03-21 04:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:05.225326402 +0000 UTC m=+1429.553113098" watchObservedRunningTime="2026-03-21 04:47:05.228770189 +0000 UTC m=+1429.556556875"
Mar 21 04:47:05 crc kubenswrapper[4839]: I0321 04:47:05.277907 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.277882882 podStartE2EDuration="2.277882882s" podCreationTimestamp="2026-03-21 04:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:05.25471438 +0000 UTC m=+1429.582501076" watchObservedRunningTime="2026-03-21 04:47:05.277882882 +0000 UTC m=+1429.605669568"
Mar 21 04:47:05 crc kubenswrapper[4839]: I0321 04:47:05.396533 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6862l"]
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.213422 4839 generic.go:334] "Generic (PLEG): container finished" podID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerID="9c66a34072939fe1bc06d5317cefcaef381970be743e11c85ddce2a426a837fb" exitCode=0
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.214027 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" event={"ID":"f0b06ab0-2209-4fb3-a837-ec755b412525","Type":"ContainerDied","Data":"9c66a34072939fe1bc06d5317cefcaef381970be743e11c85ddce2a426a837fb"}
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.214068 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" event={"ID":"f0b06ab0-2209-4fb3-a837-ec755b412525","Type":"ContainerStarted","Data":"223ef65b13d73e2f7904cd94127f13fb98845ae46f6fe3c063ed71c9184d7fbc"}
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.370362 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.371253 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="ceilometer-central-agent" containerID="cri-o://c9624ee09cc62f49f1e2db6cc40410325ea975a44c5ea0f1eaa54772bc90e8de" gracePeriod=30
Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.371323 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="proxy-httpd" containerID="cri-o://9c55f46092a6529b94b2130cf1674611a911d2c1c073359151d36cbadd55f2db" gracePeriod=30
Mar
21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.371368 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="sg-core" containerID="cri-o://b489bca73bc4e58653d79d57303a9cbf2e7e55c12b829b4e8ad5f82f29e57974" gracePeriod=30 Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.371425 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="ceilometer-notification-agent" containerID="cri-o://81cce601ac7ae485ab54fe031a2c9780e7939ffd4d001fc1df6fa33f481f6387" gracePeriod=30 Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.386272 4839 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.206:3000/\": read tcp 10.217.0.2:57470->10.217.0.206:3000: read: connection reset by peer" Mar 21 04:47:06 crc kubenswrapper[4839]: I0321 04:47:06.589275 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:06 crc kubenswrapper[4839]: E0321 04:47:06.787723 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab85f3f4_7277_419b_96bc_4f56d5891b16.slice/crio-9c55f46092a6529b94b2130cf1674611a911d2c1c073359151d36cbadd55f2db.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab85f3f4_7277_419b_96bc_4f56d5891b16.slice/crio-conmon-9c55f46092a6529b94b2130cf1674611a911d2c1c073359151d36cbadd55f2db.scope\": RecentStats: unable to find data in memory cache]" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.227333 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" event={"ID":"f0b06ab0-2209-4fb3-a837-ec755b412525","Type":"ContainerStarted","Data":"49e68a91d6df7e43ddd3ea0fec63512f2ded0793c00e4d188853502265e78a28"} Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.227661 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.255442 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" podStartSLOduration=3.2554163799999998 podStartE2EDuration="3.25541638s" podCreationTimestamp="2026-03-21 04:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:07.249274577 +0000 UTC m=+1431.577061263" watchObservedRunningTime="2026-03-21 04:47:07.25541638 +0000 UTC m=+1431.583203056" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258127 4839 generic.go:334] "Generic (PLEG): container finished" podID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerID="9c55f46092a6529b94b2130cf1674611a911d2c1c073359151d36cbadd55f2db" exitCode=0 Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258163 4839 generic.go:334] "Generic (PLEG): container finished" podID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerID="b489bca73bc4e58653d79d57303a9cbf2e7e55c12b829b4e8ad5f82f29e57974" exitCode=2 Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258175 4839 generic.go:334] "Generic (PLEG): container finished" podID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerID="81cce601ac7ae485ab54fe031a2c9780e7939ffd4d001fc1df6fa33f481f6387" exitCode=0 Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258185 4839 generic.go:334] "Generic (PLEG): container finished" podID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerID="c9624ee09cc62f49f1e2db6cc40410325ea975a44c5ea0f1eaa54772bc90e8de" exitCode=0 Mar 21 04:47:07 crc 
kubenswrapper[4839]: I0321 04:47:07.258381 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-log" containerID="cri-o://31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92" gracePeriod=30 Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258682 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerDied","Data":"9c55f46092a6529b94b2130cf1674611a911d2c1c073359151d36cbadd55f2db"} Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258720 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerDied","Data":"b489bca73bc4e58653d79d57303a9cbf2e7e55c12b829b4e8ad5f82f29e57974"} Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258736 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerDied","Data":"81cce601ac7ae485ab54fe031a2c9780e7939ffd4d001fc1df6fa33f481f6387"} Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.258746 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerDied","Data":"c9624ee09cc62f49f1e2db6cc40410325ea975a44c5ea0f1eaa54772bc90e8de"} Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.259061 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-api" containerID="cri-o://deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f" gracePeriod=30 Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.492309 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.672546 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-config-data\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.672630 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-log-httpd\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.672745 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-run-httpd\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.672829 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-sg-core-conf-yaml\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.672873 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-combined-ca-bundle\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.672899 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-scripts\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.672934 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtppj\" (UniqueName: \"kubernetes.io/projected/ab85f3f4-7277-419b-96bc-4f56d5891b16-kube-api-access-mtppj\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.673000 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-ceilometer-tls-certs\") pod \"ab85f3f4-7277-419b-96bc-4f56d5891b16\" (UID: \"ab85f3f4-7277-419b-96bc-4f56d5891b16\") " Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.673041 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.673578 4839 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.673865 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.680000 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab85f3f4-7277-419b-96bc-4f56d5891b16-kube-api-access-mtppj" (OuterVolumeSpecName: "kube-api-access-mtppj") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "kube-api-access-mtppj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.681677 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-scripts" (OuterVolumeSpecName: "scripts") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.716367 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.729745 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.775083 4839 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab85f3f4-7277-419b-96bc-4f56d5891b16-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.775119 4839 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.775134 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.775146 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtppj\" (UniqueName: \"kubernetes.io/projected/ab85f3f4-7277-419b-96bc-4f56d5891b16-kube-api-access-mtppj\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.775157 4839 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.777311 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.807585 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-config-data" (OuterVolumeSpecName: "config-data") pod "ab85f3f4-7277-419b-96bc-4f56d5891b16" (UID: "ab85f3f4-7277-419b-96bc-4f56d5891b16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.877608 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:07 crc kubenswrapper[4839]: I0321 04:47:07.877869 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab85f3f4-7277-419b-96bc-4f56d5891b16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.275351 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab85f3f4-7277-419b-96bc-4f56d5891b16","Type":"ContainerDied","Data":"a7cee1eec896bbb6d355b98092ee2f6320ccf9ad32ba43390936af295231ddd4"} Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.276300 4839 scope.go:117] "RemoveContainer" containerID="9c55f46092a6529b94b2130cf1674611a911d2c1c073359151d36cbadd55f2db" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.275727 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.278881 4839 generic.go:334] "Generic (PLEG): container finished" podID="233bba1a-658e-4073-acb5-c80398a849f1" containerID="31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92" exitCode=143 Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.279035 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233bba1a-658e-4073-acb5-c80398a849f1","Type":"ContainerDied","Data":"31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92"} Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.301700 4839 scope.go:117] "RemoveContainer" containerID="b489bca73bc4e58653d79d57303a9cbf2e7e55c12b829b4e8ad5f82f29e57974" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.316825 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.335608 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.339287 4839 scope.go:117] "RemoveContainer" containerID="81cce601ac7ae485ab54fe031a2c9780e7939ffd4d001fc1df6fa33f481f6387" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.362659 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:47:08 crc kubenswrapper[4839]: E0321 04:47:08.363357 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="proxy-httpd" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363392 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="proxy-httpd" Mar 21 04:47:08 crc kubenswrapper[4839]: E0321 04:47:08.363428 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" 
containerName="ceilometer-notification-agent" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363437 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="ceilometer-notification-agent" Mar 21 04:47:08 crc kubenswrapper[4839]: E0321 04:47:08.363470 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="ceilometer-central-agent" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363479 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="ceilometer-central-agent" Mar 21 04:47:08 crc kubenswrapper[4839]: E0321 04:47:08.363496 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="sg-core" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363504 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="sg-core" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363717 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="proxy-httpd" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363733 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="sg-core" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363753 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="ceilometer-central-agent" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.363796 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" containerName="ceilometer-notification-agent" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.366046 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.368782 4839 scope.go:117] "RemoveContainer" containerID="c9624ee09cc62f49f1e2db6cc40410325ea975a44c5ea0f1eaa54772bc90e8de" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.369393 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.370561 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.371238 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.389251 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.390303 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.390349 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-scripts\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.390390 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1041d12-2cae-4009-a3f3-9df6e219d03b-log-httpd\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.390458 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1041d12-2cae-4009-a3f3-9df6e219d03b-run-httpd\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.390507 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qncg\" (UniqueName: \"kubernetes.io/projected/d1041d12-2cae-4009-a3f3-9df6e219d03b-kube-api-access-2qncg\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.390538 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-config-data\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.390599 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.389696 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.472337 4839 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab85f3f4-7277-419b-96bc-4f56d5891b16" path="/var/lib/kubelet/pods/ab85f3f4-7277-419b-96bc-4f56d5891b16/volumes" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.491976 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.492059 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.492089 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-scripts\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.492125 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1041d12-2cae-4009-a3f3-9df6e219d03b-log-httpd\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.492771 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1041d12-2cae-4009-a3f3-9df6e219d03b-log-httpd\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.492175 
4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1041d12-2cae-4009-a3f3-9df6e219d03b-run-httpd\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.493128 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1041d12-2cae-4009-a3f3-9df6e219d03b-run-httpd\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.493216 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qncg\" (UniqueName: \"kubernetes.io/projected/d1041d12-2cae-4009-a3f3-9df6e219d03b-kube-api-access-2qncg\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.493277 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-config-data\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.493659 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.497279 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-scripts\") pod \"ceilometer-0\" (UID: 
\"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.497985 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-config-data\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.498591 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.499025 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.500114 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1041d12-2cae-4009-a3f3-9df6e219d03b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.508695 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qncg\" (UniqueName: \"kubernetes.io/projected/d1041d12-2cae-4009-a3f3-9df6e219d03b-kube-api-access-2qncg\") pod \"ceilometer-0\" (UID: \"d1041d12-2cae-4009-a3f3-9df6e219d03b\") " pod="openstack/ceilometer-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.570051 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:08 crc kubenswrapper[4839]: I0321 04:47:08.777277 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 21 04:47:09 crc kubenswrapper[4839]: W0321 04:47:09.229907 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1041d12_2cae_4009_a3f3_9df6e219d03b.slice/crio-1f124d4e1c5b85e0428b76ea46da810507bdfa3cbd9fe3bc66187a84ade8184f WatchSource:0}: Error finding container 1f124d4e1c5b85e0428b76ea46da810507bdfa3cbd9fe3bc66187a84ade8184f: Status 404 returned error can't find the container with id 1f124d4e1c5b85e0428b76ea46da810507bdfa3cbd9fe3bc66187a84ade8184f Mar 21 04:47:09 crc kubenswrapper[4839]: I0321 04:47:09.238791 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 21 04:47:09 crc kubenswrapper[4839]: I0321 04:47:09.296185 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1041d12-2cae-4009-a3f3-9df6e219d03b","Type":"ContainerStarted","Data":"1f124d4e1c5b85e0428b76ea46da810507bdfa3cbd9fe3bc66187a84ade8184f"} Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.304858 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1041d12-2cae-4009-a3f3-9df6e219d03b","Type":"ContainerStarted","Data":"55d2f2d12a7309ea8eebf9d85f9e7ac8e6fc4bfa05f1a19e2884034606a058b9"} Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.930662 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.955309 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233bba1a-658e-4073-acb5-c80398a849f1-logs\") pod \"233bba1a-658e-4073-acb5-c80398a849f1\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.955436 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-config-data\") pod \"233bba1a-658e-4073-acb5-c80398a849f1\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.955521 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-combined-ca-bundle\") pod \"233bba1a-658e-4073-acb5-c80398a849f1\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.955675 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbjsj\" (UniqueName: \"kubernetes.io/projected/233bba1a-658e-4073-acb5-c80398a849f1-kube-api-access-cbjsj\") pod \"233bba1a-658e-4073-acb5-c80398a849f1\" (UID: \"233bba1a-658e-4073-acb5-c80398a849f1\") " Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.963675 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/233bba1a-658e-4073-acb5-c80398a849f1-logs" (OuterVolumeSpecName: "logs") pod "233bba1a-658e-4073-acb5-c80398a849f1" (UID: "233bba1a-658e-4073-acb5-c80398a849f1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.968151 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233bba1a-658e-4073-acb5-c80398a849f1-kube-api-access-cbjsj" (OuterVolumeSpecName: "kube-api-access-cbjsj") pod "233bba1a-658e-4073-acb5-c80398a849f1" (UID: "233bba1a-658e-4073-acb5-c80398a849f1"). InnerVolumeSpecName "kube-api-access-cbjsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.988971 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-config-data" (OuterVolumeSpecName: "config-data") pod "233bba1a-658e-4073-acb5-c80398a849f1" (UID: "233bba1a-658e-4073-acb5-c80398a849f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:10 crc kubenswrapper[4839]: I0321 04:47:10.997522 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "233bba1a-658e-4073-acb5-c80398a849f1" (UID: "233bba1a-658e-4073-acb5-c80398a849f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.058305 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbjsj\" (UniqueName: \"kubernetes.io/projected/233bba1a-658e-4073-acb5-c80398a849f1-kube-api-access-cbjsj\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.058332 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/233bba1a-658e-4073-acb5-c80398a849f1-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.058341 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.058350 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/233bba1a-658e-4073-acb5-c80398a849f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.319739 4839 generic.go:334] "Generic (PLEG): container finished" podID="233bba1a-658e-4073-acb5-c80398a849f1" containerID="deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f" exitCode=0 Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.319799 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233bba1a-658e-4073-acb5-c80398a849f1","Type":"ContainerDied","Data":"deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f"} Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.319825 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"233bba1a-658e-4073-acb5-c80398a849f1","Type":"ContainerDied","Data":"b12b8ea76c192eae2fc0d2d772e7797d514c5f048bf2be61c0b8a59a9057d453"} Mar 21 04:47:11 crc kubenswrapper[4839]: 
I0321 04:47:11.319841 4839 scope.go:117] "RemoveContainer" containerID="deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.319934 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.326282 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1041d12-2cae-4009-a3f3-9df6e219d03b","Type":"ContainerStarted","Data":"1ca8631fd39c71f0c76e109d72496b0706ab64826ec05aa45944e9a60f5abc30"} Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.367654 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.371335 4839 scope.go:117] "RemoveContainer" containerID="31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.393754 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.426974 4839 scope.go:117] "RemoveContainer" containerID="deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f" Mar 21 04:47:11 crc kubenswrapper[4839]: E0321 04:47:11.431279 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f\": container with ID starting with deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f not found: ID does not exist" containerID="deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.432185 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f"} err="failed to get container status 
\"deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f\": rpc error: code = NotFound desc = could not find container \"deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f\": container with ID starting with deb071d3d18e498c97565ddfbf97de4939f8ca911621645835647f16b1a60c9f not found: ID does not exist" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.432225 4839 scope.go:117] "RemoveContainer" containerID="31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.432591 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:11 crc kubenswrapper[4839]: E0321 04:47:11.432754 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92\": container with ID starting with 31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92 not found: ID does not exist" containerID="31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.432814 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92"} err="failed to get container status \"31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92\": rpc error: code = NotFound desc = could not find container \"31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92\": container with ID starting with 31f18552e99ec1604ace2931e4e04c749226768da827514f9d6063a08cb84d92 not found: ID does not exist" Mar 21 04:47:11 crc kubenswrapper[4839]: E0321 04:47:11.433097 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-log" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.433165 4839 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-log" Mar 21 04:47:11 crc kubenswrapper[4839]: E0321 04:47:11.433341 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-api" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.433461 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-api" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.433729 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-api" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.433802 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="233bba1a-658e-4073-acb5-c80398a849f1" containerName="nova-api-log" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.434954 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.437164 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.441211 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.442294 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.446562 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.572110 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gj6w\" (UniqueName: \"kubernetes.io/projected/968e5045-c2d8-4fba-9011-0a81fa2b95a3-kube-api-access-7gj6w\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.572221 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-public-tls-certs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.572251 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.572517 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/968e5045-c2d8-4fba-9011-0a81fa2b95a3-logs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.572604 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.572650 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-config-data\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.677962 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gj6w\" (UniqueName: \"kubernetes.io/projected/968e5045-c2d8-4fba-9011-0a81fa2b95a3-kube-api-access-7gj6w\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.678050 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-public-tls-certs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.678071 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " 
pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.678130 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/968e5045-c2d8-4fba-9011-0a81fa2b95a3-logs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.678152 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.678168 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-config-data\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.680698 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/968e5045-c2d8-4fba-9011-0a81fa2b95a3-logs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.693411 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.693824 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.694034 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-public-tls-certs\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.694859 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-config-data\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.697836 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gj6w\" (UniqueName: \"kubernetes.io/projected/968e5045-c2d8-4fba-9011-0a81fa2b95a3-kube-api-access-7gj6w\") pod \"nova-api-0\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " pod="openstack/nova-api-0" Mar 21 04:47:11 crc kubenswrapper[4839]: I0321 04:47:11.767438 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:47:12 crc kubenswrapper[4839]: I0321 04:47:12.253387 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:12 crc kubenswrapper[4839]: W0321 04:47:12.254920 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod968e5045_c2d8_4fba_9011_0a81fa2b95a3.slice/crio-3291d4efbcbfe77a6e6e2dd5345f3dfb280041836b19d5e1194b1100ef0d1f64 WatchSource:0}: Error finding container 3291d4efbcbfe77a6e6e2dd5345f3dfb280041836b19d5e1194b1100ef0d1f64: Status 404 returned error can't find the container with id 3291d4efbcbfe77a6e6e2dd5345f3dfb280041836b19d5e1194b1100ef0d1f64 Mar 21 04:47:12 crc kubenswrapper[4839]: I0321 04:47:12.350727 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1041d12-2cae-4009-a3f3-9df6e219d03b","Type":"ContainerStarted","Data":"fd6c67538c0b1e67c9621e2de3fecc0f885e4fbb28c56c5b9b0fbfb23c369ac4"} Mar 21 04:47:12 crc kubenswrapper[4839]: I0321 04:47:12.351764 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"968e5045-c2d8-4fba-9011-0a81fa2b95a3","Type":"ContainerStarted","Data":"3291d4efbcbfe77a6e6e2dd5345f3dfb280041836b19d5e1194b1100ef0d1f64"} Mar 21 04:47:12 crc kubenswrapper[4839]: I0321 04:47:12.463300 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="233bba1a-658e-4073-acb5-c80398a849f1" path="/var/lib/kubelet/pods/233bba1a-658e-4073-acb5-c80398a849f1/volumes" Mar 21 04:47:13 crc kubenswrapper[4839]: I0321 04:47:13.366062 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"968e5045-c2d8-4fba-9011-0a81fa2b95a3","Type":"ContainerStarted","Data":"4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c"} Mar 21 04:47:13 crc kubenswrapper[4839]: I0321 04:47:13.366453 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"968e5045-c2d8-4fba-9011-0a81fa2b95a3","Type":"ContainerStarted","Data":"2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad"} Mar 21 04:47:13 crc kubenswrapper[4839]: I0321 04:47:13.606123 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 21 04:47:13 crc kubenswrapper[4839]: I0321 04:47:13.606175 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 21 04:47:13 crc kubenswrapper[4839]: I0321 04:47:13.609346 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:13 crc kubenswrapper[4839]: I0321 04:47:13.635552 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:13 crc kubenswrapper[4839]: I0321 04:47:13.667915 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.6678926020000002 podStartE2EDuration="2.667892602s" podCreationTimestamp="2026-03-21 04:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:13.407712399 +0000 UTC m=+1437.735499085" watchObservedRunningTime="2026-03-21 04:47:13.667892602 +0000 UTC m=+1437.995679278" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.378375 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1041d12-2cae-4009-a3f3-9df6e219d03b","Type":"ContainerStarted","Data":"de018109d491c3a05b4a9a0a0f84bf56007711a954d24819a83da36ce16df2f6"} Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.396432 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.397053 4839 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.626955733 podStartE2EDuration="6.397033604s" podCreationTimestamp="2026-03-21 04:47:08 +0000 UTC" firstStartedPulling="2026-03-21 04:47:09.232541087 +0000 UTC m=+1433.560327763" lastFinishedPulling="2026-03-21 04:47:13.002618948 +0000 UTC m=+1437.330405634" observedRunningTime="2026-03-21 04:47:14.396800377 +0000 UTC m=+1438.724587053" watchObservedRunningTime="2026-03-21 04:47:14.397033604 +0000 UTC m=+1438.724820280" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.572022 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-f7kjm"] Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.573882 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.581418 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.584415 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.588233 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-f7kjm"] Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.628724 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.628720 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8e28a9be-2244-43bb-9043-2ededa502897" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.632913 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-config-data\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.633011 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-scripts\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.633054 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fqkd\" (UniqueName: \"kubernetes.io/projected/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-kube-api-access-9fqkd\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.633084 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.734714 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-config-data\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.734814 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-scripts\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.734852 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fqkd\" (UniqueName: \"kubernetes.io/projected/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-kube-api-access-9fqkd\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.734879 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.740842 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-scripts\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.741124 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-config-data\") pod 
\"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.748157 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.754952 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fqkd\" (UniqueName: \"kubernetes.io/projected/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-kube-api-access-9fqkd\") pod \"nova-cell1-cell-mapping-f7kjm\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.852773 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.903004 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.937960 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-gwlp7"] Mar 21 04:47:14 crc kubenswrapper[4839]: I0321 04:47:14.938193 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" podUID="378a796b-e896-48a8-9e03-65e3b371c636" containerName="dnsmasq-dns" containerID="cri-o://d33f1cdd73480cf38d5a67e559fe413c35de9b47b49b6298618c23ca1c61bfaa" gracePeriod=10 Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.409758 4839 generic.go:334] "Generic (PLEG): container finished" podID="378a796b-e896-48a8-9e03-65e3b371c636" containerID="d33f1cdd73480cf38d5a67e559fe413c35de9b47b49b6298618c23ca1c61bfaa" exitCode=0 Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.410398 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" event={"ID":"378a796b-e896-48a8-9e03-65e3b371c636","Type":"ContainerDied","Data":"d33f1cdd73480cf38d5a67e559fe413c35de9b47b49b6298618c23ca1c61bfaa"} Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.412353 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.463518 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-f7kjm"] Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.483228 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.655244 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-svc\") pod \"378a796b-e896-48a8-9e03-65e3b371c636\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.655620 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c6c4\" (UniqueName: \"kubernetes.io/projected/378a796b-e896-48a8-9e03-65e3b371c636-kube-api-access-4c6c4\") pod \"378a796b-e896-48a8-9e03-65e3b371c636\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.655777 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-config\") pod \"378a796b-e896-48a8-9e03-65e3b371c636\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.655822 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-nb\") pod \"378a796b-e896-48a8-9e03-65e3b371c636\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.655910 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-sb\") pod \"378a796b-e896-48a8-9e03-65e3b371c636\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.655941 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-swift-storage-0\") pod \"378a796b-e896-48a8-9e03-65e3b371c636\" (UID: \"378a796b-e896-48a8-9e03-65e3b371c636\") " Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.661326 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/378a796b-e896-48a8-9e03-65e3b371c636-kube-api-access-4c6c4" (OuterVolumeSpecName: "kube-api-access-4c6c4") pod "378a796b-e896-48a8-9e03-65e3b371c636" (UID: "378a796b-e896-48a8-9e03-65e3b371c636"). InnerVolumeSpecName "kube-api-access-4c6c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.703367 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "378a796b-e896-48a8-9e03-65e3b371c636" (UID: "378a796b-e896-48a8-9e03-65e3b371c636"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.711016 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "378a796b-e896-48a8-9e03-65e3b371c636" (UID: "378a796b-e896-48a8-9e03-65e3b371c636"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.716754 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "378a796b-e896-48a8-9e03-65e3b371c636" (UID: "378a796b-e896-48a8-9e03-65e3b371c636"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.724334 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-config" (OuterVolumeSpecName: "config") pod "378a796b-e896-48a8-9e03-65e3b371c636" (UID: "378a796b-e896-48a8-9e03-65e3b371c636"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.735803 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "378a796b-e896-48a8-9e03-65e3b371c636" (UID: "378a796b-e896-48a8-9e03-65e3b371c636"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.758941 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.758984 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.758999 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.759016 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:15 
crc kubenswrapper[4839]: I0321 04:47:15.759030 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/378a796b-e896-48a8-9e03-65e3b371c636-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:15 crc kubenswrapper[4839]: I0321 04:47:15.759044 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c6c4\" (UniqueName: \"kubernetes.io/projected/378a796b-e896-48a8-9e03-65e3b371c636-kube-api-access-4c6c4\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.425024 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f7kjm" event={"ID":"6c8778a4-d8b7-4331-be57-d1844b3c0f9f","Type":"ContainerStarted","Data":"07f2e48c7301d0027bae700357d24a79a9ba9d36dd4d10cd8158d308e2f8bf3d"} Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.425451 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f7kjm" event={"ID":"6c8778a4-d8b7-4331-be57-d1844b3c0f9f","Type":"ContainerStarted","Data":"9f409cc408d1754bbcdbaab1c8502e074d738e287574c49b7ecc38ef5824176e"} Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.426655 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.427339 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-gwlp7" event={"ID":"378a796b-e896-48a8-9e03-65e3b371c636","Type":"ContainerDied","Data":"49d3afde602a166e8c5a9ef71743020fdf1f738c3940d641e7dae2434ec0eb13"} Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.427374 4839 scope.go:117] "RemoveContainer" containerID="d33f1cdd73480cf38d5a67e559fe413c35de9b47b49b6298618c23ca1c61bfaa" Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.456692 4839 scope.go:117] "RemoveContainer" containerID="880297fb77f65981125f101cde38f55dd95860faac6dbd936272889d1aa0b1aa" Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.468382 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-f7kjm" podStartSLOduration=2.468366173 podStartE2EDuration="2.468366173s" podCreationTimestamp="2026-03-21 04:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:16.442529666 +0000 UTC m=+1440.770316362" watchObservedRunningTime="2026-03-21 04:47:16.468366173 +0000 UTC m=+1440.796152839" Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.490805 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-gwlp7"] Mar 21 04:47:16 crc kubenswrapper[4839]: I0321 04:47:16.500186 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-gwlp7"] Mar 21 04:47:18 crc kubenswrapper[4839]: I0321 04:47:18.466936 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="378a796b-e896-48a8-9e03-65e3b371c636" path="/var/lib/kubelet/pods/378a796b-e896-48a8-9e03-65e3b371c636/volumes" Mar 21 04:47:21 crc kubenswrapper[4839]: I0321 04:47:21.485273 4839 generic.go:334] "Generic (PLEG): container 
finished" podID="6c8778a4-d8b7-4331-be57-d1844b3c0f9f" containerID="07f2e48c7301d0027bae700357d24a79a9ba9d36dd4d10cd8158d308e2f8bf3d" exitCode=0 Mar 21 04:47:21 crc kubenswrapper[4839]: I0321 04:47:21.485349 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f7kjm" event={"ID":"6c8778a4-d8b7-4331-be57-d1844b3c0f9f","Type":"ContainerDied","Data":"07f2e48c7301d0027bae700357d24a79a9ba9d36dd4d10cd8158d308e2f8bf3d"} Mar 21 04:47:21 crc kubenswrapper[4839]: I0321 04:47:21.605087 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 21 04:47:21 crc kubenswrapper[4839]: I0321 04:47:21.605404 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 21 04:47:21 crc kubenswrapper[4839]: I0321 04:47:21.768383 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 04:47:21 crc kubenswrapper[4839]: I0321 04:47:21.768443 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 04:47:22 crc kubenswrapper[4839]: I0321 04:47:22.783718 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 04:47:22 crc kubenswrapper[4839]: I0321 04:47:22.783773 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 04:47:22 crc kubenswrapper[4839]: I0321 04:47:22.902869 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.004470 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fqkd\" (UniqueName: \"kubernetes.io/projected/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-kube-api-access-9fqkd\") pod \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.004592 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-scripts\") pod \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.004722 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-config-data\") pod \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.004752 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-combined-ca-bundle\") pod \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\" (UID: \"6c8778a4-d8b7-4331-be57-d1844b3c0f9f\") " Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.010797 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-scripts" (OuterVolumeSpecName: "scripts") pod "6c8778a4-d8b7-4331-be57-d1844b3c0f9f" (UID: "6c8778a4-d8b7-4331-be57-d1844b3c0f9f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.012420 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-kube-api-access-9fqkd" (OuterVolumeSpecName: "kube-api-access-9fqkd") pod "6c8778a4-d8b7-4331-be57-d1844b3c0f9f" (UID: "6c8778a4-d8b7-4331-be57-d1844b3c0f9f"). InnerVolumeSpecName "kube-api-access-9fqkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.031011 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c8778a4-d8b7-4331-be57-d1844b3c0f9f" (UID: "6c8778a4-d8b7-4331-be57-d1844b3c0f9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.049256 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-config-data" (OuterVolumeSpecName: "config-data") pod "6c8778a4-d8b7-4331-be57-d1844b3c0f9f" (UID: "6c8778a4-d8b7-4331-be57-d1844b3c0f9f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.107413 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.107444 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.107455 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fqkd\" (UniqueName: \"kubernetes.io/projected/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-kube-api-access-9fqkd\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.107463 4839 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8778a4-d8b7-4331-be57-d1844b3c0f9f-scripts\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.503528 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-f7kjm" event={"ID":"6c8778a4-d8b7-4331-be57-d1844b3c0f9f","Type":"ContainerDied","Data":"9f409cc408d1754bbcdbaab1c8502e074d738e287574c49b7ecc38ef5824176e"} Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.503979 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f409cc408d1754bbcdbaab1c8502e074d738e287574c49b7ecc38ef5824176e" Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.503584 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-f7kjm" Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.610077 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.610830 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.615187 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.698989 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.699316 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-log" containerID="cri-o://2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad" gracePeriod=30 Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.699842 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-api" containerID="cri-o://4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c" gracePeriod=30 Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.716000 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.716260 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1ee2fcd4-456d-436a-ae9e-95f8224e2834" containerName="nova-scheduler-scheduler" containerID="cri-o://09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" gracePeriod=30 Mar 21 04:47:23 crc kubenswrapper[4839]: I0321 04:47:23.739768 4839 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:47:24 crc kubenswrapper[4839]: I0321 04:47:24.514157 4839 generic.go:334] "Generic (PLEG): container finished" podID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerID="2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad" exitCode=143 Mar 21 04:47:24 crc kubenswrapper[4839]: I0321 04:47:24.514246 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"968e5045-c2d8-4fba-9011-0a81fa2b95a3","Type":"ContainerDied","Data":"2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad"} Mar 21 04:47:24 crc kubenswrapper[4839]: I0321 04:47:24.520659 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 21 04:47:25 crc kubenswrapper[4839]: E0321 04:47:25.350754 4839 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 04:47:25 crc kubenswrapper[4839]: E0321 04:47:25.352377 4839 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 04:47:25 crc kubenswrapper[4839]: E0321 04:47:25.353976 4839 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 21 04:47:25 crc 
kubenswrapper[4839]: E0321 04:47:25.354051 4839 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1ee2fcd4-456d-436a-ae9e-95f8224e2834" containerName="nova-scheduler-scheduler" Mar 21 04:47:25 crc kubenswrapper[4839]: I0321 04:47:25.521849 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-log" containerID="cri-o://c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48" gracePeriod=30 Mar 21 04:47:25 crc kubenswrapper[4839]: I0321 04:47:25.522070 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-metadata" containerID="cri-o://a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5" gracePeriod=30 Mar 21 04:47:26 crc kubenswrapper[4839]: I0321 04:47:26.531644 4839 generic.go:334] "Generic (PLEG): container finished" podID="8e28a9be-2244-43bb-9043-2ededa502897" containerID="c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48" exitCode=143 Mar 21 04:47:26 crc kubenswrapper[4839]: I0321 04:47:26.531683 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e28a9be-2244-43bb-9043-2ededa502897","Type":"ContainerDied","Data":"c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48"} Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.491638 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.560512 4839 generic.go:334] "Generic (PLEG): container finished" podID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerID="4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c" exitCode=0 Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.560588 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"968e5045-c2d8-4fba-9011-0a81fa2b95a3","Type":"ContainerDied","Data":"4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c"} Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.560620 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"968e5045-c2d8-4fba-9011-0a81fa2b95a3","Type":"ContainerDied","Data":"3291d4efbcbfe77a6e6e2dd5345f3dfb280041836b19d5e1194b1100ef0d1f64"} Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.560642 4839 scope.go:117] "RemoveContainer" containerID="4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.560821 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.590327 4839 scope.go:117] "RemoveContainer" containerID="2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.609023 4839 scope.go:117] "RemoveContainer" containerID="4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c" Mar 21 04:47:28 crc kubenswrapper[4839]: E0321 04:47:28.609497 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c\": container with ID starting with 4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c not found: ID does not exist" containerID="4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.609528 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c"} err="failed to get container status \"4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c\": rpc error: code = NotFound desc = could not find container \"4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c\": container with ID starting with 4126f0f0236467088a06a786d560aeca176722ae7ed2730314d9af69edbaed6c not found: ID does not exist" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.609548 4839 scope.go:117] "RemoveContainer" containerID="2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad" Mar 21 04:47:28 crc kubenswrapper[4839]: E0321 04:47:28.609850 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad\": container with ID starting with 
2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad not found: ID does not exist" containerID="2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.609879 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad"} err="failed to get container status \"2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad\": rpc error: code = NotFound desc = could not find container \"2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad\": container with ID starting with 2e809d0916446ccc1b6f965c38dcf8b14dcb39acbc20bb5dcd7f8eed60511dad not found: ID does not exist" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.649374 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-public-tls-certs\") pod \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.649480 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-config-data\") pod \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.649556 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/968e5045-c2d8-4fba-9011-0a81fa2b95a3-logs\") pod \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.651449 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/968e5045-c2d8-4fba-9011-0a81fa2b95a3-logs" (OuterVolumeSpecName: "logs") pod "968e5045-c2d8-4fba-9011-0a81fa2b95a3" (UID: "968e5045-c2d8-4fba-9011-0a81fa2b95a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.651669 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-combined-ca-bundle\") pod \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.651783 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gj6w\" (UniqueName: \"kubernetes.io/projected/968e5045-c2d8-4fba-9011-0a81fa2b95a3-kube-api-access-7gj6w\") pod \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.652171 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-internal-tls-certs\") pod \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\" (UID: \"968e5045-c2d8-4fba-9011-0a81fa2b95a3\") " Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.653040 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/968e5045-c2d8-4fba-9011-0a81fa2b95a3-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.668255 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/968e5045-c2d8-4fba-9011-0a81fa2b95a3-kube-api-access-7gj6w" (OuterVolumeSpecName: "kube-api-access-7gj6w") pod "968e5045-c2d8-4fba-9011-0a81fa2b95a3" (UID: "968e5045-c2d8-4fba-9011-0a81fa2b95a3"). 
InnerVolumeSpecName "kube-api-access-7gj6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.681120 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "968e5045-c2d8-4fba-9011-0a81fa2b95a3" (UID: "968e5045-c2d8-4fba-9011-0a81fa2b95a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.681424 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-config-data" (OuterVolumeSpecName: "config-data") pod "968e5045-c2d8-4fba-9011-0a81fa2b95a3" (UID: "968e5045-c2d8-4fba-9011-0a81fa2b95a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.698466 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "968e5045-c2d8-4fba-9011-0a81fa2b95a3" (UID: "968e5045-c2d8-4fba-9011-0a81fa2b95a3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.701100 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "968e5045-c2d8-4fba-9011-0a81fa2b95a3" (UID: "968e5045-c2d8-4fba-9011-0a81fa2b95a3"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.754253 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.754295 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gj6w\" (UniqueName: \"kubernetes.io/projected/968e5045-c2d8-4fba-9011-0a81fa2b95a3-kube-api-access-7gj6w\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.754306 4839 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.754314 4839 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.754323 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/968e5045-c2d8-4fba-9011-0a81fa2b95a3-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.913473 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.943303 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.954621 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:28 crc kubenswrapper[4839]: E0321 04:47:28.955077 4839 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="378a796b-e896-48a8-9e03-65e3b371c636" containerName="dnsmasq-dns" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955089 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="378a796b-e896-48a8-9e03-65e3b371c636" containerName="dnsmasq-dns" Mar 21 04:47:28 crc kubenswrapper[4839]: E0321 04:47:28.955103 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-api" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955108 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-api" Mar 21 04:47:28 crc kubenswrapper[4839]: E0321 04:47:28.955130 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-log" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955136 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-log" Mar 21 04:47:28 crc kubenswrapper[4839]: E0321 04:47:28.955153 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="378a796b-e896-48a8-9e03-65e3b371c636" containerName="init" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955160 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="378a796b-e896-48a8-9e03-65e3b371c636" containerName="init" Mar 21 04:47:28 crc kubenswrapper[4839]: E0321 04:47:28.955167 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c8778a4-d8b7-4331-be57-d1844b3c0f9f" containerName="nova-manage" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955193 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8778a4-d8b7-4331-be57-d1844b3c0f9f" containerName="nova-manage" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955349 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" 
containerName="nova-api-log" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955362 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c8778a4-d8b7-4331-be57-d1844b3c0f9f" containerName="nova-manage" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955374 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" containerName="nova-api-api" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.955385 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="378a796b-e896-48a8-9e03-65e3b371c636" containerName="dnsmasq-dns" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.956519 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.959126 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.959173 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.959379 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 21 04:47:28 crc kubenswrapper[4839]: I0321 04:47:28.975508 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.058490 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-logs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.058533 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.058557 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.058694 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-public-tls-certs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.058722 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8g5h\" (UniqueName: \"kubernetes.io/projected/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-kube-api-access-z8g5h\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.058832 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-config-data\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.061078 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.160322 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-logs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.160366 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.160391 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.160473 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-public-tls-certs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.160506 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8g5h\" (UniqueName: \"kubernetes.io/projected/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-kube-api-access-z8g5h\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.160586 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-config-data\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.161865 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-logs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.165434 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-config-data\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.166145 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.166263 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-public-tls-certs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.167126 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.195865 4839 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-z8g5h\" (UniqueName: \"kubernetes.io/projected/627bf6a3-cf5d-42e1-9250-ba6684bb2cfc-kube-api-access-z8g5h\") pod \"nova-api-0\" (UID: \"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc\") " pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.261729 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b7fw\" (UniqueName: \"kubernetes.io/projected/8e28a9be-2244-43bb-9043-2ededa502897-kube-api-access-2b7fw\") pod \"8e28a9be-2244-43bb-9043-2ededa502897\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.261827 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-config-data\") pod \"8e28a9be-2244-43bb-9043-2ededa502897\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.261879 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-nova-metadata-tls-certs\") pod \"8e28a9be-2244-43bb-9043-2ededa502897\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.261935 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-combined-ca-bundle\") pod \"8e28a9be-2244-43bb-9043-2ededa502897\" (UID: \"8e28a9be-2244-43bb-9043-2ededa502897\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.261999 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e28a9be-2244-43bb-9043-2ededa502897-logs\") pod \"8e28a9be-2244-43bb-9043-2ededa502897\" (UID: 
\"8e28a9be-2244-43bb-9043-2ededa502897\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.262713 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e28a9be-2244-43bb-9043-2ededa502897-logs" (OuterVolumeSpecName: "logs") pod "8e28a9be-2244-43bb-9043-2ededa502897" (UID: "8e28a9be-2244-43bb-9043-2ededa502897"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.282459 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e28a9be-2244-43bb-9043-2ededa502897-kube-api-access-2b7fw" (OuterVolumeSpecName: "kube-api-access-2b7fw") pod "8e28a9be-2244-43bb-9043-2ededa502897" (UID: "8e28a9be-2244-43bb-9043-2ededa502897"). InnerVolumeSpecName "kube-api-access-2b7fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.291510 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e28a9be-2244-43bb-9043-2ededa502897" (UID: "8e28a9be-2244-43bb-9043-2ededa502897"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.319783 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-config-data" (OuterVolumeSpecName: "config-data") pod "8e28a9be-2244-43bb-9043-2ededa502897" (UID: "8e28a9be-2244-43bb-9043-2ededa502897"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.351797 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.364918 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8e28a9be-2244-43bb-9043-2ededa502897" (UID: "8e28a9be-2244-43bb-9043-2ededa502897"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.373091 4839 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e28a9be-2244-43bb-9043-2ededa502897-logs\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.373135 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b7fw\" (UniqueName: \"kubernetes.io/projected/8e28a9be-2244-43bb-9043-2ededa502897-kube-api-access-2b7fw\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.373150 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.373162 4839 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.373173 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e28a9be-2244-43bb-9043-2ededa502897-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.430024 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.574437 4839 generic.go:334] "Generic (PLEG): container finished" podID="8e28a9be-2244-43bb-9043-2ededa502897" containerID="a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5" exitCode=0 Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.574587 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e28a9be-2244-43bb-9043-2ededa502897","Type":"ContainerDied","Data":"a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5"} Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.574860 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8e28a9be-2244-43bb-9043-2ededa502897","Type":"ContainerDied","Data":"6d254a4eb3f7d7fdb366dc10b2c861d9dba719fa5c681ffb26fb0cc817d0f6f3"} Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.574678 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.574887 4839 scope.go:117] "RemoveContainer" containerID="a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.576580 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4szpp\" (UniqueName: \"kubernetes.io/projected/1ee2fcd4-456d-436a-ae9e-95f8224e2834-kube-api-access-4szpp\") pod \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.576673 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-combined-ca-bundle\") pod \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.577276 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-config-data\") pod \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\" (UID: \"1ee2fcd4-456d-436a-ae9e-95f8224e2834\") " Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.580542 4839 generic.go:334] "Generic (PLEG): container finished" podID="1ee2fcd4-456d-436a-ae9e-95f8224e2834" containerID="09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" exitCode=0 Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.580600 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1ee2fcd4-456d-436a-ae9e-95f8224e2834","Type":"ContainerDied","Data":"09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978"} Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.580631 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"1ee2fcd4-456d-436a-ae9e-95f8224e2834","Type":"ContainerDied","Data":"2dc68d964bf781e74e92f46f7b166439f6d66a340baaedb7718535c26ac20b36"} Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.580702 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.581005 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee2fcd4-456d-436a-ae9e-95f8224e2834-kube-api-access-4szpp" (OuterVolumeSpecName: "kube-api-access-4szpp") pod "1ee2fcd4-456d-436a-ae9e-95f8224e2834" (UID: "1ee2fcd4-456d-436a-ae9e-95f8224e2834"). InnerVolumeSpecName "kube-api-access-4szpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.599844 4839 scope.go:117] "RemoveContainer" containerID="c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.610577 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-config-data" (OuterVolumeSpecName: "config-data") pod "1ee2fcd4-456d-436a-ae9e-95f8224e2834" (UID: "1ee2fcd4-456d-436a-ae9e-95f8224e2834"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.613064 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ee2fcd4-456d-436a-ae9e-95f8224e2834" (UID: "1ee2fcd4-456d-436a-ae9e-95f8224e2834"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.619138 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.641826 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.660760 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:47:29 crc kubenswrapper[4839]: E0321 04:47:29.661353 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-log" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.661370 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-log" Mar 21 04:47:29 crc kubenswrapper[4839]: E0321 04:47:29.661389 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-metadata" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.661396 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-metadata" Mar 21 04:47:29 crc kubenswrapper[4839]: E0321 04:47:29.661423 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee2fcd4-456d-436a-ae9e-95f8224e2834" containerName="nova-scheduler-scheduler" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.661436 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee2fcd4-456d-436a-ae9e-95f8224e2834" containerName="nova-scheduler-scheduler" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.661667 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-metadata" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 
04:47:29.661688 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e28a9be-2244-43bb-9043-2ededa502897" containerName="nova-metadata-log" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.661703 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee2fcd4-456d-436a-ae9e-95f8224e2834" containerName="nova-scheduler-scheduler" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.661948 4839 scope.go:117] "RemoveContainer" containerID="a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.662953 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: E0321 04:47:29.672186 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5\": container with ID starting with a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5 not found: ID does not exist" containerID="a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.672251 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5"} err="failed to get container status \"a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5\": rpc error: code = NotFound desc = could not find container \"a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5\": container with ID starting with a015fcef30e3edb89fbd565a12ab7be168cef7141e42ffd913f3c444b15b9cb5 not found: ID does not exist" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.672284 4839 scope.go:117] "RemoveContainer" containerID="c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48" Mar 21 04:47:29 crc kubenswrapper[4839]: 
I0321 04:47:29.673178 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.673390 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.673466 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:47:29 crc kubenswrapper[4839]: E0321 04:47:29.678928 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48\": container with ID starting with c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48 not found: ID does not exist" containerID="c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.678991 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48"} err="failed to get container status \"c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48\": rpc error: code = NotFound desc = could not find container \"c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48\": container with ID starting with c9bb20765c0019dc8a9e6988a830d859fed3dd4c660411ba1ffc4dc6daa22e48 not found: ID does not exist" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.679033 4839 scope.go:117] "RemoveContainer" containerID="09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.679912 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc 
kubenswrapper[4839]: I0321 04:47:29.679946 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee2fcd4-456d-436a-ae9e-95f8224e2834-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.679959 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4szpp\" (UniqueName: \"kubernetes.io/projected/1ee2fcd4-456d-436a-ae9e-95f8224e2834-kube-api-access-4szpp\") on node \"crc\" DevicePath \"\"" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.711560 4839 scope.go:117] "RemoveContainer" containerID="09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" Mar 21 04:47:29 crc kubenswrapper[4839]: E0321 04:47:29.712275 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978\": container with ID starting with 09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978 not found: ID does not exist" containerID="09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.712322 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978"} err="failed to get container status \"09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978\": rpc error: code = NotFound desc = could not find container \"09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978\": container with ID starting with 09c33300639c44e28a412a85ce7db7748f2c84c1bd2fbec427d1d44f42a06978 not found: ID does not exist" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.781938 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0aafbc7f-e890-4a32-8531-f148aeea18e6-logs\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.782012 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-config-data\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.782160 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.782200 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2gkg\" (UniqueName: \"kubernetes.io/projected/0aafbc7f-e890-4a32-8531-f148aeea18e6-kube-api-access-s2gkg\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.782406 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: W0321 04:47:29.876706 4839 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod627bf6a3_cf5d_42e1_9250_ba6684bb2cfc.slice/crio-3283d8452731a9b98569c92c6ce618f615e4b8dd5186f3210f59ff284faeb863 WatchSource:0}: Error finding container 3283d8452731a9b98569c92c6ce618f615e4b8dd5186f3210f59ff284faeb863: Status 404 returned error can't find the container with id 3283d8452731a9b98569c92c6ce618f615e4b8dd5186f3210f59ff284faeb863 Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.877539 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.885556 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aafbc7f-e890-4a32-8531-f148aeea18e6-logs\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.885627 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-config-data\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.885689 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.885720 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2gkg\" (UniqueName: \"kubernetes.io/projected/0aafbc7f-e890-4a32-8531-f148aeea18e6-kube-api-access-s2gkg\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " 
pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.885773 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.886287 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0aafbc7f-e890-4a32-8531-f148aeea18e6-logs\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.889954 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.890218 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-config-data\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.890306 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aafbc7f-e890-4a32-8531-f148aeea18e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.905598 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2gkg\" (UniqueName: 
\"kubernetes.io/projected/0aafbc7f-e890-4a32-8531-f148aeea18e6-kube-api-access-s2gkg\") pod \"nova-metadata-0\" (UID: \"0aafbc7f-e890-4a32-8531-f148aeea18e6\") " pod="openstack/nova-metadata-0" Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.926053 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.939104 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:47:29 crc kubenswrapper[4839]: I0321 04:47:29.960613 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:29.962829 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:29.965407 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:29.972069 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:29.988844 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:29.988921 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj2fn\" (UniqueName: \"kubernetes.io/projected/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-kube-api-access-tj2fn\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:29.989078 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-config-data\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:29.995998 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.089762 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.090368 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj2fn\" (UniqueName: \"kubernetes.io/projected/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-kube-api-access-tj2fn\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.090465 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-config-data\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.095724 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 
04:47:30.098950 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-config-data\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.113149 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj2fn\" (UniqueName: \"kubernetes.io/projected/bbecccff-0ecc-44ff-a57b-f7289b8bcf5a-kube-api-access-tj2fn\") pod \"nova-scheduler-0\" (UID: \"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a\") " pod="openstack/nova-scheduler-0" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.292471 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.463743 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee2fcd4-456d-436a-ae9e-95f8224e2834" path="/var/lib/kubelet/pods/1ee2fcd4-456d-436a-ae9e-95f8224e2834/volumes" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.464655 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e28a9be-2244-43bb-9043-2ededa502897" path="/var/lib/kubelet/pods/8e28a9be-2244-43bb-9043-2ededa502897/volumes" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.465321 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="968e5045-c2d8-4fba-9011-0a81fa2b95a3" path="/var/lib/kubelet/pods/968e5045-c2d8-4fba-9011-0a81fa2b95a3/volumes" Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.595542 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc","Type":"ContainerStarted","Data":"7940cd1faee5fd561ec476f66f5450aa5a8dc708421e7e79724db8e7453dedca"} Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.595596 4839 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-api-0" event={"ID":"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc","Type":"ContainerStarted","Data":"0b820abaeb5bf853c0bbac610f2e9119d63d24e6251736b33d73065353027322"} Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.595614 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"627bf6a3-cf5d-42e1-9250-ba6684bb2cfc","Type":"ContainerStarted","Data":"3283d8452731a9b98569c92c6ce618f615e4b8dd5186f3210f59ff284faeb863"} Mar 21 04:47:30 crc kubenswrapper[4839]: I0321 04:47:30.624347 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.624330559 podStartE2EDuration="2.624330559s" podCreationTimestamp="2026-03-21 04:47:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:30.622764035 +0000 UTC m=+1454.950550721" watchObservedRunningTime="2026-03-21 04:47:30.624330559 +0000 UTC m=+1454.952117235" Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.139140 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 21 04:47:31 crc kubenswrapper[4839]: W0321 04:47:31.144275 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbecccff_0ecc_44ff_a57b_f7289b8bcf5a.slice/crio-532cd331f4b4bfadced6d32f461f1b3aa7af6a207e4c3134243bfb7413a97873 WatchSource:0}: Error finding container 532cd331f4b4bfadced6d32f461f1b3aa7af6a207e4c3134243bfb7413a97873: Status 404 returned error can't find the container with id 532cd331f4b4bfadced6d32f461f1b3aa7af6a207e4c3134243bfb7413a97873 Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.147273 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.607830 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"0aafbc7f-e890-4a32-8531-f148aeea18e6","Type":"ContainerStarted","Data":"a118aa3df2fe7292795057b6d29b804d3a5495d74d5bab2d0e6ec99ace13ba78"} Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.607897 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0aafbc7f-e890-4a32-8531-f148aeea18e6","Type":"ContainerStarted","Data":"147b88cf5646198b8c45418c7c3437d1bb597e68449481a9d3059a44ca3dc5c8"} Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.607912 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0aafbc7f-e890-4a32-8531-f148aeea18e6","Type":"ContainerStarted","Data":"ab89a0ab3ec9c73d28ffe25d85bfed521e615c51fda977373fb2f11682983456"} Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.610361 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a","Type":"ContainerStarted","Data":"10a225aeaf244148b45f1297b659b648d1a6ae727a554cc7ee1ac3dd86eb8195"} Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.610425 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbecccff-0ecc-44ff-a57b-f7289b8bcf5a","Type":"ContainerStarted","Data":"532cd331f4b4bfadced6d32f461f1b3aa7af6a207e4c3134243bfb7413a97873"} Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 04:47:31.637759 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.637734672 podStartE2EDuration="2.637734672s" podCreationTimestamp="2026-03-21 04:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:31.627424812 +0000 UTC m=+1455.955211508" watchObservedRunningTime="2026-03-21 04:47:31.637734672 +0000 UTC m=+1455.965521348" Mar 21 04:47:31 crc kubenswrapper[4839]: I0321 
04:47:31.653120 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.653099555 podStartE2EDuration="2.653099555s" podCreationTimestamp="2026-03-21 04:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:47:31.640395087 +0000 UTC m=+1455.968181823" watchObservedRunningTime="2026-03-21 04:47:31.653099555 +0000 UTC m=+1455.980886231" Mar 21 04:47:35 crc kubenswrapper[4839]: I0321 04:47:35.292904 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 21 04:47:38 crc kubenswrapper[4839]: I0321 04:47:38.786948 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 21 04:47:39 crc kubenswrapper[4839]: I0321 04:47:39.353659 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 04:47:39 crc kubenswrapper[4839]: I0321 04:47:39.353704 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 21 04:47:39 crc kubenswrapper[4839]: I0321 04:47:39.996609 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 21 04:47:39 crc kubenswrapper[4839]: I0321 04:47:39.996769 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 21 04:47:40 crc kubenswrapper[4839]: I0321 04:47:40.293515 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 21 04:47:40 crc kubenswrapper[4839]: I0321 04:47:40.347385 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 21 04:47:40 crc kubenswrapper[4839]: I0321 04:47:40.366773 4839 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="627bf6a3-cf5d-42e1-9250-ba6684bb2cfc" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 04:47:40 crc kubenswrapper[4839]: I0321 04:47:40.366841 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="627bf6a3-cf5d-42e1-9250-ba6684bb2cfc" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 04:47:40 crc kubenswrapper[4839]: I0321 04:47:40.730109 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 21 04:47:41 crc kubenswrapper[4839]: I0321 04:47:41.010734 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0aafbc7f-e890-4a32-8531-f148aeea18e6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 04:47:41 crc kubenswrapper[4839]: I0321 04:47:41.010739 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0aafbc7f-e890-4a32-8531-f148aeea18e6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 21 04:47:47 crc kubenswrapper[4839]: I0321 04:47:47.352629 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 21 04:47:47 crc kubenswrapper[4839]: I0321 04:47:47.353003 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 21 04:47:47 crc kubenswrapper[4839]: I0321 04:47:47.996671 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Mar 21 04:47:47 crc kubenswrapper[4839]: I0321 04:47:47.996732 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 21 04:47:49 crc kubenswrapper[4839]: I0321 04:47:49.362318 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 21 04:47:49 crc kubenswrapper[4839]: I0321 04:47:49.368337 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 21 04:47:49 crc kubenswrapper[4839]: I0321 04:47:49.374833 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 21 04:47:49 crc kubenswrapper[4839]: I0321 04:47:49.791751 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 21 04:47:50 crc kubenswrapper[4839]: I0321 04:47:50.004484 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 21 04:47:50 crc kubenswrapper[4839]: I0321 04:47:50.010169 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 21 04:47:50 crc kubenswrapper[4839]: I0321 04:47:50.011520 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 21 04:47:50 crc kubenswrapper[4839]: I0321 04:47:50.794315 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 21 04:47:59 crc kubenswrapper[4839]: I0321 04:47:59.004154 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 04:47:59 crc kubenswrapper[4839]: I0321 04:47:59.928235 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.146647 4839 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29567808-pxvv9"] Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.147926 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567808-pxvv9" Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.152514 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.152541 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.157852 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.158744 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567808-pxvv9"] Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.212263 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99cb7\" (UniqueName: \"kubernetes.io/projected/5d56af53-fce2-4320-b4fa-32b5c6798921-kube-api-access-99cb7\") pod \"auto-csr-approver-29567808-pxvv9\" (UID: \"5d56af53-fce2-4320-b4fa-32b5c6798921\") " pod="openshift-infra/auto-csr-approver-29567808-pxvv9" Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.313936 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99cb7\" (UniqueName: \"kubernetes.io/projected/5d56af53-fce2-4320-b4fa-32b5c6798921-kube-api-access-99cb7\") pod \"auto-csr-approver-29567808-pxvv9\" (UID: \"5d56af53-fce2-4320-b4fa-32b5c6798921\") " pod="openshift-infra/auto-csr-approver-29567808-pxvv9" Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.337774 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99cb7\" (UniqueName: 
\"kubernetes.io/projected/5d56af53-fce2-4320-b4fa-32b5c6798921-kube-api-access-99cb7\") pod \"auto-csr-approver-29567808-pxvv9\" (UID: \"5d56af53-fce2-4320-b4fa-32b5c6798921\") " pod="openshift-infra/auto-csr-approver-29567808-pxvv9" Mar 21 04:48:00 crc kubenswrapper[4839]: I0321 04:48:00.469533 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567808-pxvv9" Mar 21 04:48:01 crc kubenswrapper[4839]: I0321 04:48:01.002082 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567808-pxvv9"] Mar 21 04:48:01 crc kubenswrapper[4839]: I0321 04:48:01.922725 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567808-pxvv9" event={"ID":"5d56af53-fce2-4320-b4fa-32b5c6798921","Type":"ContainerStarted","Data":"4db35d3cf811a0383cb9214531f4ae97e7bb78cbf5eb7b01a56d33a37845c03c"} Mar 21 04:48:02 crc kubenswrapper[4839]: I0321 04:48:02.933871 4839 generic.go:334] "Generic (PLEG): container finished" podID="5d56af53-fce2-4320-b4fa-32b5c6798921" containerID="fe7545d66419e9d11543f534eecf214e1fa485d02ad773333c092ee39cadde88" exitCode=0 Mar 21 04:48:02 crc kubenswrapper[4839]: I0321 04:48:02.933973 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567808-pxvv9" event={"ID":"5d56af53-fce2-4320-b4fa-32b5c6798921","Type":"ContainerDied","Data":"fe7545d66419e9d11543f534eecf214e1fa485d02ad773333c092ee39cadde88"} Mar 21 04:48:03 crc kubenswrapper[4839]: I0321 04:48:03.608343 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="8028561c-b039-4400-a065-b5efee753b5f" containerName="rabbitmq" containerID="cri-o://804d2b77429b6dcf4164535d9f43dee6f0cff10defca7a0d78be2b02039b8f92" gracePeriod=604796 Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.264701 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567808-pxvv9" Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.395766 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99cb7\" (UniqueName: \"kubernetes.io/projected/5d56af53-fce2-4320-b4fa-32b5c6798921-kube-api-access-99cb7\") pod \"5d56af53-fce2-4320-b4fa-32b5c6798921\" (UID: \"5d56af53-fce2-4320-b4fa-32b5c6798921\") " Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.401097 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d56af53-fce2-4320-b4fa-32b5c6798921-kube-api-access-99cb7" (OuterVolumeSpecName: "kube-api-access-99cb7") pod "5d56af53-fce2-4320-b4fa-32b5c6798921" (UID: "5d56af53-fce2-4320-b4fa-32b5c6798921"). InnerVolumeSpecName "kube-api-access-99cb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.498133 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99cb7\" (UniqueName: \"kubernetes.io/projected/5d56af53-fce2-4320-b4fa-32b5c6798921-kube-api-access-99cb7\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.950761 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" containerName="rabbitmq" containerID="cri-o://7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318" gracePeriod=604795 Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.956018 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567808-pxvv9" event={"ID":"5d56af53-fce2-4320-b4fa-32b5c6798921","Type":"ContainerDied","Data":"4db35d3cf811a0383cb9214531f4ae97e7bb78cbf5eb7b01a56d33a37845c03c"} Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.956064 4839 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="4db35d3cf811a0383cb9214531f4ae97e7bb78cbf5eb7b01a56d33a37845c03c" Mar 21 04:48:04 crc kubenswrapper[4839]: I0321 04:48:04.956103 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567808-pxvv9" Mar 21 04:48:05 crc kubenswrapper[4839]: I0321 04:48:05.334031 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567802-zsmks"] Mar 21 04:48:05 crc kubenswrapper[4839]: I0321 04:48:05.343245 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567802-zsmks"] Mar 21 04:48:06 crc kubenswrapper[4839]: I0321 04:48:06.473621 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab3902e0-a483-447f-b86c-4fe8e8983152" path="/var/lib/kubelet/pods/ab3902e0-a483-447f-b86c-4fe8e8983152/volumes" Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.661184 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kwl97"] Mar 21 04:48:08 crc kubenswrapper[4839]: E0321 04:48:08.662890 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d56af53-fce2-4320-b4fa-32b5c6798921" containerName="oc" Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.662912 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d56af53-fce2-4320-b4fa-32b5c6798921" containerName="oc" Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.663151 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d56af53-fce2-4320-b4fa-32b5c6798921" containerName="oc" Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.664885 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.677758 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwl97"] Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.776248 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-utilities\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.776337 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-catalog-content\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.776490 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl8vh\" (UniqueName: \"kubernetes.io/projected/346daec7-d0f8-4237-a189-2b84c2a65207-kube-api-access-nl8vh\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.892754 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-utilities\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.892903 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-catalog-content\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.893008 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl8vh\" (UniqueName: \"kubernetes.io/projected/346daec7-d0f8-4237-a189-2b84c2a65207-kube-api-access-nl8vh\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.894078 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-utilities\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.894316 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-catalog-content\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.916076 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl8vh\" (UniqueName: \"kubernetes.io/projected/346daec7-d0f8-4237-a189-2b84c2a65207-kube-api-access-nl8vh\") pod \"redhat-operators-kwl97\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:08 crc kubenswrapper[4839]: I0321 04:48:08.995690 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:09 crc kubenswrapper[4839]: I0321 04:48:09.475360 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwl97"] Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.005349 4839 generic.go:334] "Generic (PLEG): container finished" podID="8028561c-b039-4400-a065-b5efee753b5f" containerID="804d2b77429b6dcf4164535d9f43dee6f0cff10defca7a0d78be2b02039b8f92" exitCode=0 Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.005647 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8028561c-b039-4400-a065-b5efee753b5f","Type":"ContainerDied","Data":"804d2b77429b6dcf4164535d9f43dee6f0cff10defca7a0d78be2b02039b8f92"} Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.007862 4839 generic.go:334] "Generic (PLEG): container finished" podID="346daec7-d0f8-4237-a189-2b84c2a65207" containerID="acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b" exitCode=0 Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.007919 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwl97" event={"ID":"346daec7-d0f8-4237-a189-2b84c2a65207","Type":"ContainerDied","Data":"acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b"} Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.007944 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwl97" event={"ID":"346daec7-d0f8-4237-a189-2b84c2a65207","Type":"ContainerStarted","Data":"4d51b3ae90fbcc99929a148d8825194077e5974286f615cf5f4f49328360ebc9"} Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.227959 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323094 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-plugins-conf\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323169 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323232 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh2cd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-kube-api-access-vh2cd\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323314 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-server-conf\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323361 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-confd\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323393 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-erlang-cookie\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323451 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-config-data\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323499 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8028561c-b039-4400-a065-b5efee753b5f-pod-info\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323526 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8028561c-b039-4400-a065-b5efee753b5f-erlang-cookie-secret\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323650 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-plugins\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.323879 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.324023 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-tls\") pod \"8028561c-b039-4400-a065-b5efee753b5f\" (UID: \"8028561c-b039-4400-a065-b5efee753b5f\") " Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.324074 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.324289 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.325265 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.325286 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.325296 4839 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.333973 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8028561c-b039-4400-a065-b5efee753b5f-pod-info" (OuterVolumeSpecName: "pod-info") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.361317 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.362858 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8028561c-b039-4400-a065-b5efee753b5f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.362985 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.363149 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-kube-api-access-vh2cd" (OuterVolumeSpecName: "kube-api-access-vh2cd") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "kube-api-access-vh2cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.407109 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-config-data" (OuterVolumeSpecName: "config-data") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.409138 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-server-conf" (OuterVolumeSpecName: "server-conf") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.427338 4839 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8028561c-b039-4400-a065-b5efee753b5f-pod-info\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.427673 4839 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8028561c-b039-4400-a065-b5efee753b5f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.427783 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.427930 4839 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.428033 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh2cd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-kube-api-access-vh2cd\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.428131 4839 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-server-conf\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.428443 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8028561c-b039-4400-a065-b5efee753b5f-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.479837 4839 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.494716 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8028561c-b039-4400-a065-b5efee753b5f" (UID: "8028561c-b039-4400-a065-b5efee753b5f"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.530393 4839 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:10 crc kubenswrapper[4839]: I0321 04:48:10.530425 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8028561c-b039-4400-a065-b5efee753b5f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.020694 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8028561c-b039-4400-a065-b5efee753b5f","Type":"ContainerDied","Data":"5eaf787d4b2014f872ad6aefa43fc8d3d3baab1a1f0af69a0017de992e3a8b54"} Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.020746 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.020771 4839 scope.go:117] "RemoveContainer" containerID="804d2b77429b6dcf4164535d9f43dee6f0cff10defca7a0d78be2b02039b8f92" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.022983 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwl97" event={"ID":"346daec7-d0f8-4237-a189-2b84c2a65207","Type":"ContainerStarted","Data":"79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb"} Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.044768 4839 scope.go:117] "RemoveContainer" containerID="fcd7e300ab111a88b888a2fc68f007c49d0404de0648aa1177c5d04bb341e74c" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.075199 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.095376 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.115320 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 04:48:11 crc kubenswrapper[4839]: E0321 04:48:11.115857 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8028561c-b039-4400-a065-b5efee753b5f" containerName="setup-container" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.115877 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8028561c-b039-4400-a065-b5efee753b5f" containerName="setup-container" Mar 21 04:48:11 crc kubenswrapper[4839]: E0321 04:48:11.115914 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8028561c-b039-4400-a065-b5efee753b5f" containerName="rabbitmq" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.115921 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8028561c-b039-4400-a065-b5efee753b5f" containerName="rabbitmq" Mar 21 04:48:11 crc 
kubenswrapper[4839]: I0321 04:48:11.116084 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8028561c-b039-4400-a065-b5efee753b5f" containerName="rabbitmq" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.117045 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.120440 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.120515 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.126061 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.126241 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-nxhtb" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.126287 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.130271 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.136376 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.136551 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.246876 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.246932 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcbzl\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-kube-api-access-xcbzl\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.246963 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-config-data\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247061 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247241 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247433 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " 
pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247489 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bfff67da-8ea4-4798-9b8d-58a3abac4347-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247598 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247634 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247712 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bfff67da-8ea4-4798-9b8d-58a3abac4347-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.247742 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 
04:48:11.348975 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349070 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349104 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bfff67da-8ea4-4798-9b8d-58a3abac4347-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349137 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349157 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349291 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349652 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bfff67da-8ea4-4798-9b8d-58a3abac4347-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349761 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349813 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349831 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349862 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcbzl\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-kube-api-access-xcbzl\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") 
" pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349840 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349918 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-config-data\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.349936 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.350907 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.351298 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.351922 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/bfff67da-8ea4-4798-9b8d-58a3abac4347-config-data\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.355548 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bfff67da-8ea4-4798-9b8d-58a3abac4347-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.355723 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.357409 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.368242 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bfff67da-8ea4-4798-9b8d-58a3abac4347-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.375553 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcbzl\" (UniqueName: \"kubernetes.io/projected/bfff67da-8ea4-4798-9b8d-58a3abac4347-kube-api-access-xcbzl\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " 
pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.386913 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"bfff67da-8ea4-4798-9b8d-58a3abac4347\") " pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.464021 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.476414 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555361 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-confd\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555445 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-erlang-cookie\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555519 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-tls\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555549 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-server-conf\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555582 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-plugins-conf\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555612 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb4vz\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-kube-api-access-rb4vz\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555631 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-erlang-cookie-secret\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555658 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-plugins\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555722 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-config-data\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc 
kubenswrapper[4839]: I0321 04:48:11.555749 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.555792 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-pod-info\") pod \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\" (UID: \"6e1d0e8c-00aa-4770-9e58-b8f706d80a35\") " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.556757 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.558916 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.560989 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.582612 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.582617 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.582667 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-kube-api-access-rb4vz" (OuterVolumeSpecName: "kube-api-access-rb4vz") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "kube-api-access-rb4vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.583127 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.587253 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-config-data" (OuterVolumeSpecName: "config-data") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.590126 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-pod-info" (OuterVolumeSpecName: "pod-info") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.626431 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-server-conf" (OuterVolumeSpecName: "server-conf") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.657946 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.657996 4839 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658008 4839 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-pod-info\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658021 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658031 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658039 4839 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-server-conf\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658046 4839 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658249 4839 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-rb4vz\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-kube-api-access-rb4vz\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658260 4839 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.658268 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.715139 4839 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.749844 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6e1d0e8c-00aa-4770-9e58-b8f706d80a35" (UID: "6e1d0e8c-00aa-4770-9e58-b8f706d80a35"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.760171 4839 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:11 crc kubenswrapper[4839]: I0321 04:48:11.760214 4839 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6e1d0e8c-00aa-4770-9e58-b8f706d80a35-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.026109 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 21 04:48:12 crc kubenswrapper[4839]: W0321 04:48:12.031063 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfff67da_8ea4_4798_9b8d_58a3abac4347.slice/crio-46b13d5c0adc0bef49bf9499cfee788b52e0c94a7a04a01ffcc03ba32584dcd5 WatchSource:0}: Error finding container 46b13d5c0adc0bef49bf9499cfee788b52e0c94a7a04a01ffcc03ba32584dcd5: Status 404 returned error can't find the container with id 46b13d5c0adc0bef49bf9499cfee788b52e0c94a7a04a01ffcc03ba32584dcd5 Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.052264 4839 generic.go:334] "Generic (PLEG): container finished" podID="346daec7-d0f8-4237-a189-2b84c2a65207" containerID="79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb" exitCode=0 Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.052369 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwl97" event={"ID":"346daec7-d0f8-4237-a189-2b84c2a65207","Type":"ContainerDied","Data":"79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb"} Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.055407 4839 generic.go:334] "Generic (PLEG): container finished" podID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" 
containerID="7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318" exitCode=0 Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.055463 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e1d0e8c-00aa-4770-9e58-b8f706d80a35","Type":"ContainerDied","Data":"7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318"} Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.055487 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6e1d0e8c-00aa-4770-9e58-b8f706d80a35","Type":"ContainerDied","Data":"13cf1811708e735c8587e5f387524078eddb6176802aa11ecbd1435c38ed0541"} Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.055502 4839 scope.go:117] "RemoveContainer" containerID="7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.055635 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.212926 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.224744 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.226386 4839 scope.go:117] "RemoveContainer" containerID="e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.239771 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 04:48:12 crc kubenswrapper[4839]: E0321 04:48:12.240242 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" containerName="rabbitmq" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.240260 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" containerName="rabbitmq" Mar 21 04:48:12 crc kubenswrapper[4839]: E0321 04:48:12.240289 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" containerName="setup-container" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.240296 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" containerName="setup-container" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.240506 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" containerName="rabbitmq" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.241972 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.246011 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.246030 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.246333 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.246429 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wq8rw" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.246514 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.246642 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.246528 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.250477 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.298541 4839 scope.go:117] "RemoveContainer" containerID="7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318" Mar 21 04:48:12 crc kubenswrapper[4839]: E0321 04:48:12.299458 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318\": container with ID starting with 7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318 not found: ID does not 
exist" containerID="7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.299506 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318"} err="failed to get container status \"7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318\": rpc error: code = NotFound desc = could not find container \"7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318\": container with ID starting with 7e6871f49750eb702160c82a07f561ea187c198c633a9f96f983dabe23b81318 not found: ID does not exist" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.299530 4839 scope.go:117] "RemoveContainer" containerID="e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182" Mar 21 04:48:12 crc kubenswrapper[4839]: E0321 04:48:12.299877 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182\": container with ID starting with e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182 not found: ID does not exist" containerID="e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.299917 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182"} err="failed to get container status \"e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182\": rpc error: code = NotFound desc = could not find container \"e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182\": container with ID starting with e90ba1902491587c449fe1da895e413653e91fc3602d79a9260ecadc688aa182 not found: ID does not exist" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380035 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa82c4a0-2b0e-4e22-9e91-7fc899122414-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380114 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380143 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tbk9\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-kube-api-access-6tbk9\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380291 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380456 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380511 
4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa82c4a0-2b0e-4e22-9e91-7fc899122414-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380554 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380602 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380653 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380684 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.380730 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.465371 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e1d0e8c-00aa-4770-9e58-b8f706d80a35" path="/var/lib/kubelet/pods/6e1d0e8c-00aa-4770-9e58-b8f706d80a35/volumes" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.466282 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8028561c-b039-4400-a065-b5efee753b5f" path="/var/lib/kubelet/pods/8028561c-b039-4400-a065-b5efee753b5f/volumes" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.482451 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.482536 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa82c4a0-2b0e-4e22-9e91-7fc899122414-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.482610 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 
04:48:12.482647 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.482691 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.482875 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483205 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483439 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483502 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483513 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483753 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa82c4a0-2b0e-4e22-9e91-7fc899122414-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483760 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483772 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483850 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483874 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tbk9\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-kube-api-access-6tbk9\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.483919 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.484927 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fa82c4a0-2b0e-4e22-9e91-7fc899122414-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.487320 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fa82c4a0-2b0e-4e22-9e91-7fc899122414-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.487351 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.488004 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fa82c4a0-2b0e-4e22-9e91-7fc899122414-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.489766 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.507250 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tbk9\" (UniqueName: \"kubernetes.io/projected/fa82c4a0-2b0e-4e22-9e91-7fc899122414-kube-api-access-6tbk9\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.529370 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fa82c4a0-2b0e-4e22-9e91-7fc899122414\") " pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.596969 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-cg5zx"] Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.608312 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.615708 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.616723 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.622121 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-cg5zx"] Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.691934 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tpcn\" (UniqueName: \"kubernetes.io/projected/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-kube-api-access-9tpcn\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.692222 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.692383 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.692547 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-config\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.692677 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.692804 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.692995 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-svc\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.794751 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-config\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.794814 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.794873 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.794967 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-svc\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.795019 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tpcn\" (UniqueName: \"kubernetes.io/projected/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-kube-api-access-9tpcn\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.795042 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.795106 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.796105 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.796774 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-config\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.797492 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-svc\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.801153 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.802395 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: 
\"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.804466 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.826488 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tpcn\" (UniqueName: \"kubernetes.io/projected/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-kube-api-access-9tpcn\") pod \"dnsmasq-dns-d558885bc-cg5zx\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:12 crc kubenswrapper[4839]: I0321 04:48:12.933738 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:13 crc kubenswrapper[4839]: I0321 04:48:13.088134 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bfff67da-8ea4-4798-9b8d-58a3abac4347","Type":"ContainerStarted","Data":"46b13d5c0adc0bef49bf9499cfee788b52e0c94a7a04a01ffcc03ba32584dcd5"} Mar 21 04:48:13 crc kubenswrapper[4839]: I0321 04:48:13.239611 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 21 04:48:13 crc kubenswrapper[4839]: W0321 04:48:13.440124 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa3a88fe_e92a_48a2_9d53_97c2e2c16407.slice/crio-6afe4a7a44e7d79961cd510bbe96aaeab650fa9c2ae4dcc1d6d1067536db5cf3 WatchSource:0}: Error finding container 6afe4a7a44e7d79961cd510bbe96aaeab650fa9c2ae4dcc1d6d1067536db5cf3: Status 404 returned error can't find the container with id 
6afe4a7a44e7d79961cd510bbe96aaeab650fa9c2ae4dcc1d6d1067536db5cf3 Mar 21 04:48:13 crc kubenswrapper[4839]: I0321 04:48:13.448323 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-cg5zx"] Mar 21 04:48:14 crc kubenswrapper[4839]: I0321 04:48:14.099647 4839 generic.go:334] "Generic (PLEG): container finished" podID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerID="03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc" exitCode=0 Mar 21 04:48:14 crc kubenswrapper[4839]: I0321 04:48:14.099753 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" event={"ID":"fa3a88fe-e92a-48a2-9d53-97c2e2c16407","Type":"ContainerDied","Data":"03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc"} Mar 21 04:48:14 crc kubenswrapper[4839]: I0321 04:48:14.100042 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" event={"ID":"fa3a88fe-e92a-48a2-9d53-97c2e2c16407","Type":"ContainerStarted","Data":"6afe4a7a44e7d79961cd510bbe96aaeab650fa9c2ae4dcc1d6d1067536db5cf3"} Mar 21 04:48:14 crc kubenswrapper[4839]: I0321 04:48:14.101222 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fa82c4a0-2b0e-4e22-9e91-7fc899122414","Type":"ContainerStarted","Data":"25a926428acdd1dd9442d2adc6952a7148d2fc65ae1286292b5a5776f9568879"} Mar 21 04:48:14 crc kubenswrapper[4839]: I0321 04:48:14.107332 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwl97" event={"ID":"346daec7-d0f8-4237-a189-2b84c2a65207","Type":"ContainerStarted","Data":"8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d"} Mar 21 04:48:14 crc kubenswrapper[4839]: I0321 04:48:14.112885 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"bfff67da-8ea4-4798-9b8d-58a3abac4347","Type":"ContainerStarted","Data":"644c67f4886f966671f8710482c7d08251548f19faca3155012ce9a9e6664332"} Mar 21 04:48:14 crc kubenswrapper[4839]: I0321 04:48:14.171405 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kwl97" podStartSLOduration=3.470987738 podStartE2EDuration="6.171377384s" podCreationTimestamp="2026-03-21 04:48:08 +0000 UTC" firstStartedPulling="2026-03-21 04:48:10.014818196 +0000 UTC m=+1494.342604872" lastFinishedPulling="2026-03-21 04:48:12.715207842 +0000 UTC m=+1497.042994518" observedRunningTime="2026-03-21 04:48:14.152246587 +0000 UTC m=+1498.480033263" watchObservedRunningTime="2026-03-21 04:48:14.171377384 +0000 UTC m=+1498.499164070" Mar 21 04:48:15 crc kubenswrapper[4839]: I0321 04:48:15.123509 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" event={"ID":"fa3a88fe-e92a-48a2-9d53-97c2e2c16407","Type":"ContainerStarted","Data":"7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545"} Mar 21 04:48:15 crc kubenswrapper[4839]: I0321 04:48:15.123877 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:15 crc kubenswrapper[4839]: I0321 04:48:15.125334 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fa82c4a0-2b0e-4e22-9e91-7fc899122414","Type":"ContainerStarted","Data":"7ac821c6a3f3ad08d86993b9d344baebd30c5481183b9c62427b46b6443230a7"} Mar 21 04:48:15 crc kubenswrapper[4839]: I0321 04:48:15.182178 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" podStartSLOduration=3.182156365 podStartE2EDuration="3.182156365s" podCreationTimestamp="2026-03-21 04:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-21 04:48:15.14813921 +0000 UTC m=+1499.475925906" watchObservedRunningTime="2026-03-21 04:48:15.182156365 +0000 UTC m=+1499.509943061" Mar 21 04:48:18 crc kubenswrapper[4839]: I0321 04:48:18.996058 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:18 crc kubenswrapper[4839]: I0321 04:48:18.997275 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:20 crc kubenswrapper[4839]: I0321 04:48:20.048248 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kwl97" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="registry-server" probeResult="failure" output=< Mar 21 04:48:20 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 04:48:20 crc kubenswrapper[4839]: > Mar 21 04:48:22 crc kubenswrapper[4839]: I0321 04:48:22.934710 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:22 crc kubenswrapper[4839]: I0321 04:48:22.995070 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6862l"] Mar 21 04:48:22 crc kubenswrapper[4839]: I0321 04:48:22.995338 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" podUID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerName="dnsmasq-dns" containerID="cri-o://49e68a91d6df7e43ddd3ea0fec63512f2ded0793c00e4d188853502265e78a28" gracePeriod=10 Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.163061 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-n4nl2"] Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.166387 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.188684 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-n4nl2"] Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.202901 4839 generic.go:334] "Generic (PLEG): container finished" podID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerID="49e68a91d6df7e43ddd3ea0fec63512f2ded0793c00e4d188853502265e78a28" exitCode=0 Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.202950 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" event={"ID":"f0b06ab0-2209-4fb3-a837-ec755b412525","Type":"ContainerDied","Data":"49e68a91d6df7e43ddd3ea0fec63512f2ded0793c00e4d188853502265e78a28"} Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.300534 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.300597 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkclb\" (UniqueName: \"kubernetes.io/projected/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-kube-api-access-gkclb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.300920 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " 
pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.300990 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.301033 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.301156 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.301184 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-config\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.403006 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") 
" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.403075 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.403105 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.403193 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.403234 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-config\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.403330 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 
21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.403371 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkclb\" (UniqueName: \"kubernetes.io/projected/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-kube-api-access-gkclb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.404809 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.404819 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.404914 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.405198 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-config\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.405562 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.416081 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.430453 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkclb\" (UniqueName: \"kubernetes.io/projected/a31699b4-0a8f-42c8-b7f4-319ef1d5423a-kube-api-access-gkclb\") pod \"dnsmasq-dns-78c64bc9c5-n4nl2\" (UID: \"a31699b4-0a8f-42c8-b7f4-319ef1d5423a\") " pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.541555 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.661599 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.810210 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-nb\") pod \"f0b06ab0-2209-4fb3-a837-ec755b412525\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.810304 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-sb\") pod \"f0b06ab0-2209-4fb3-a837-ec755b412525\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.810349 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-svc\") pod \"f0b06ab0-2209-4fb3-a837-ec755b412525\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.810434 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfp48\" (UniqueName: \"kubernetes.io/projected/f0b06ab0-2209-4fb3-a837-ec755b412525-kube-api-access-pfp48\") pod \"f0b06ab0-2209-4fb3-a837-ec755b412525\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.810460 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-config\") pod \"f0b06ab0-2209-4fb3-a837-ec755b412525\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.810583 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-swift-storage-0\") pod \"f0b06ab0-2209-4fb3-a837-ec755b412525\" (UID: \"f0b06ab0-2209-4fb3-a837-ec755b412525\") " Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.820130 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0b06ab0-2209-4fb3-a837-ec755b412525-kube-api-access-pfp48" (OuterVolumeSpecName: "kube-api-access-pfp48") pod "f0b06ab0-2209-4fb3-a837-ec755b412525" (UID: "f0b06ab0-2209-4fb3-a837-ec755b412525"). InnerVolumeSpecName "kube-api-access-pfp48". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.863079 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f0b06ab0-2209-4fb3-a837-ec755b412525" (UID: "f0b06ab0-2209-4fb3-a837-ec755b412525"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.863813 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f0b06ab0-2209-4fb3-a837-ec755b412525" (UID: "f0b06ab0-2209-4fb3-a837-ec755b412525"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.878094 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f0b06ab0-2209-4fb3-a837-ec755b412525" (UID: "f0b06ab0-2209-4fb3-a837-ec755b412525"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.878528 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-config" (OuterVolumeSpecName: "config") pod "f0b06ab0-2209-4fb3-a837-ec755b412525" (UID: "f0b06ab0-2209-4fb3-a837-ec755b412525"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.885528 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f0b06ab0-2209-4fb3-a837-ec755b412525" (UID: "f0b06ab0-2209-4fb3-a837-ec755b412525"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.913284 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.913318 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.913329 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.913338 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:23 crc 
kubenswrapper[4839]: I0321 04:48:23.913349 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfp48\" (UniqueName: \"kubernetes.io/projected/f0b06ab0-2209-4fb3-a837-ec755b412525-kube-api-access-pfp48\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:23 crc kubenswrapper[4839]: I0321 04:48:23.913358 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0b06ab0-2209-4fb3-a837-ec755b412525-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.026023 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-n4nl2"] Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.216732 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" event={"ID":"f0b06ab0-2209-4fb3-a837-ec755b412525","Type":"ContainerDied","Data":"223ef65b13d73e2f7904cd94127f13fb98845ae46f6fe3c063ed71c9184d7fbc"} Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.217065 4839 scope.go:117] "RemoveContainer" containerID="49e68a91d6df7e43ddd3ea0fec63512f2ded0793c00e4d188853502265e78a28" Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.216764 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-6862l" Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.218535 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" event={"ID":"a31699b4-0a8f-42c8-b7f4-319ef1d5423a","Type":"ContainerStarted","Data":"9630e345dc685e54ca50f351a3dee5431dd5ed44eb1c8385623484c27885d436"} Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.263336 4839 scope.go:117] "RemoveContainer" containerID="9c66a34072939fe1bc06d5317cefcaef381970be743e11c85ddce2a426a837fb" Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.281938 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6862l"] Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.289634 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-6862l"] Mar 21 04:48:24 crc kubenswrapper[4839]: I0321 04:48:24.463980 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0b06ab0-2209-4fb3-a837-ec755b412525" path="/var/lib/kubelet/pods/f0b06ab0-2209-4fb3-a837-ec755b412525/volumes" Mar 21 04:48:25 crc kubenswrapper[4839]: I0321 04:48:25.232803 4839 generic.go:334] "Generic (PLEG): container finished" podID="a31699b4-0a8f-42c8-b7f4-319ef1d5423a" containerID="2f10ef34f922523dcddaa070f87245aa996ca456d8cc3c469dc033bc9dc5d8e7" exitCode=0 Mar 21 04:48:25 crc kubenswrapper[4839]: I0321 04:48:25.233362 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" event={"ID":"a31699b4-0a8f-42c8-b7f4-319ef1d5423a","Type":"ContainerDied","Data":"2f10ef34f922523dcddaa070f87245aa996ca456d8cc3c469dc033bc9dc5d8e7"} Mar 21 04:48:26 crc kubenswrapper[4839]: I0321 04:48:26.244115 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" 
event={"ID":"a31699b4-0a8f-42c8-b7f4-319ef1d5423a","Type":"ContainerStarted","Data":"3eb74633227d6594ed27aa32b037a49f3d26f7a5e99f04f430b702b54d398bab"} Mar 21 04:48:26 crc kubenswrapper[4839]: I0321 04:48:26.244410 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:26 crc kubenswrapper[4839]: I0321 04:48:26.268098 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" podStartSLOduration=3.26807993 podStartE2EDuration="3.26807993s" podCreationTimestamp="2026-03-21 04:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:48:26.26272264 +0000 UTC m=+1510.590509316" watchObservedRunningTime="2026-03-21 04:48:26.26807993 +0000 UTC m=+1510.595866606" Mar 21 04:48:29 crc kubenswrapper[4839]: I0321 04:48:29.061515 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:29 crc kubenswrapper[4839]: I0321 04:48:29.113332 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:29 crc kubenswrapper[4839]: I0321 04:48:29.299821 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwl97"] Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.288541 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kwl97" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="registry-server" containerID="cri-o://8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d" gracePeriod=2 Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.746029 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.849715 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-utilities\") pod \"346daec7-d0f8-4237-a189-2b84c2a65207\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.849798 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-catalog-content\") pod \"346daec7-d0f8-4237-a189-2b84c2a65207\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.850036 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl8vh\" (UniqueName: \"kubernetes.io/projected/346daec7-d0f8-4237-a189-2b84c2a65207-kube-api-access-nl8vh\") pod \"346daec7-d0f8-4237-a189-2b84c2a65207\" (UID: \"346daec7-d0f8-4237-a189-2b84c2a65207\") " Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.851672 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-utilities" (OuterVolumeSpecName: "utilities") pod "346daec7-d0f8-4237-a189-2b84c2a65207" (UID: "346daec7-d0f8-4237-a189-2b84c2a65207"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.857944 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/346daec7-d0f8-4237-a189-2b84c2a65207-kube-api-access-nl8vh" (OuterVolumeSpecName: "kube-api-access-nl8vh") pod "346daec7-d0f8-4237-a189-2b84c2a65207" (UID: "346daec7-d0f8-4237-a189-2b84c2a65207"). InnerVolumeSpecName "kube-api-access-nl8vh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.953309 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl8vh\" (UniqueName: \"kubernetes.io/projected/346daec7-d0f8-4237-a189-2b84c2a65207-kube-api-access-nl8vh\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.953362 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.981497 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.981635 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:48:30 crc kubenswrapper[4839]: I0321 04:48:30.984825 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "346daec7-d0f8-4237-a189-2b84c2a65207" (UID: "346daec7-d0f8-4237-a189-2b84c2a65207"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.054562 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346daec7-d0f8-4237-a189-2b84c2a65207-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.298840 4839 generic.go:334] "Generic (PLEG): container finished" podID="346daec7-d0f8-4237-a189-2b84c2a65207" containerID="8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d" exitCode=0 Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.298889 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwl97" event={"ID":"346daec7-d0f8-4237-a189-2b84c2a65207","Type":"ContainerDied","Data":"8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d"} Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.298910 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kwl97" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.298925 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwl97" event={"ID":"346daec7-d0f8-4237-a189-2b84c2a65207","Type":"ContainerDied","Data":"4d51b3ae90fbcc99929a148d8825194077e5974286f615cf5f4f49328360ebc9"} Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.298972 4839 scope.go:117] "RemoveContainer" containerID="8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.318523 4839 scope.go:117] "RemoveContainer" containerID="79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.331696 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwl97"] Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.340354 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kwl97"] Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.358987 4839 scope.go:117] "RemoveContainer" containerID="acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.387804 4839 scope.go:117] "RemoveContainer" containerID="8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d" Mar 21 04:48:31 crc kubenswrapper[4839]: E0321 04:48:31.388328 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d\": container with ID starting with 8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d not found: ID does not exist" containerID="8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.388392 4839 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d"} err="failed to get container status \"8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d\": rpc error: code = NotFound desc = could not find container \"8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d\": container with ID starting with 8f34fe59339a2072a730832f50bd62453a0d0ef1ad303dbcbb364d5ce3b9734d not found: ID does not exist" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.388436 4839 scope.go:117] "RemoveContainer" containerID="79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb" Mar 21 04:48:31 crc kubenswrapper[4839]: E0321 04:48:31.389142 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb\": container with ID starting with 79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb not found: ID does not exist" containerID="79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.389185 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb"} err="failed to get container status \"79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb\": rpc error: code = NotFound desc = could not find container \"79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb\": container with ID starting with 79758e53a6a4afc87d273da5e0a098ac34797242bb0be0e09b4fef2bf01960eb not found: ID does not exist" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.389214 4839 scope.go:117] "RemoveContainer" containerID="acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b" Mar 21 04:48:31 crc kubenswrapper[4839]: E0321 
04:48:31.389726 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b\": container with ID starting with acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b not found: ID does not exist" containerID="acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b" Mar 21 04:48:31 crc kubenswrapper[4839]: I0321 04:48:31.389754 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b"} err="failed to get container status \"acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b\": rpc error: code = NotFound desc = could not find container \"acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b\": container with ID starting with acab5d81cd2e1c37c1a56acb9fa01bf1a93058aa2c338773bcb2ab0c75d8fc2b not found: ID does not exist" Mar 21 04:48:32 crc kubenswrapper[4839]: I0321 04:48:32.463807 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" path="/var/lib/kubelet/pods/346daec7-d0f8-4237-a189-2b84c2a65207/volumes" Mar 21 04:48:33 crc kubenswrapper[4839]: I0321 04:48:33.543772 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78c64bc9c5-n4nl2" Mar 21 04:48:33 crc kubenswrapper[4839]: I0321 04:48:33.598638 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-cg5zx"] Mar 21 04:48:33 crc kubenswrapper[4839]: I0321 04:48:33.598904 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" podUID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerName="dnsmasq-dns" containerID="cri-o://7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545" gracePeriod=10 Mar 21 04:48:34 crc 
kubenswrapper[4839]: I0321 04:48:34.063150 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.208423 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-sb\") pod \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.208521 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-swift-storage-0\") pod \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.208601 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-nb\") pod \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.208627 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-config\") pod \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.208738 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-openstack-edpm-ipam\") pod \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 
04:48:34.208798 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tpcn\" (UniqueName: \"kubernetes.io/projected/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-kube-api-access-9tpcn\") pod \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.208903 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-svc\") pod \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\" (UID: \"fa3a88fe-e92a-48a2-9d53-97c2e2c16407\") " Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.255087 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-kube-api-access-9tpcn" (OuterVolumeSpecName: "kube-api-access-9tpcn") pod "fa3a88fe-e92a-48a2-9d53-97c2e2c16407" (UID: "fa3a88fe-e92a-48a2-9d53-97c2e2c16407"). InnerVolumeSpecName "kube-api-access-9tpcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.289389 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa3a88fe-e92a-48a2-9d53-97c2e2c16407" (UID: "fa3a88fe-e92a-48a2-9d53-97c2e2c16407"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.292260 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fa3a88fe-e92a-48a2-9d53-97c2e2c16407" (UID: "fa3a88fe-e92a-48a2-9d53-97c2e2c16407"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.301128 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fa3a88fe-e92a-48a2-9d53-97c2e2c16407" (UID: "fa3a88fe-e92a-48a2-9d53-97c2e2c16407"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.307464 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "fa3a88fe-e92a-48a2-9d53-97c2e2c16407" (UID: "fa3a88fe-e92a-48a2-9d53-97c2e2c16407"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.311410 4839 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.311443 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.311456 4839 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.311466 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-openstack-edpm-ipam\") 
on node \"crc\" DevicePath \"\"" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.311474 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tpcn\" (UniqueName: \"kubernetes.io/projected/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-kube-api-access-9tpcn\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.317169 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-config" (OuterVolumeSpecName: "config") pod "fa3a88fe-e92a-48a2-9d53-97c2e2c16407" (UID: "fa3a88fe-e92a-48a2-9d53-97c2e2c16407"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.318806 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fa3a88fe-e92a-48a2-9d53-97c2e2c16407" (UID: "fa3a88fe-e92a-48a2-9d53-97c2e2c16407"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.337259 4839 generic.go:334] "Generic (PLEG): container finished" podID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerID="7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545" exitCode=0 Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.337358 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.337386 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" event={"ID":"fa3a88fe-e92a-48a2-9d53-97c2e2c16407","Type":"ContainerDied","Data":"7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545"} Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.337696 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-cg5zx" event={"ID":"fa3a88fe-e92a-48a2-9d53-97c2e2c16407","Type":"ContainerDied","Data":"6afe4a7a44e7d79961cd510bbe96aaeab650fa9c2ae4dcc1d6d1067536db5cf3"} Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.337742 4839 scope.go:117] "RemoveContainer" containerID="7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.382046 4839 scope.go:117] "RemoveContainer" containerID="03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.390235 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-cg5zx"] Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.400194 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-cg5zx"] Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.414114 4839 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.414197 4839 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa3a88fe-e92a-48a2-9d53-97c2e2c16407-config\") on node \"crc\" DevicePath \"\"" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.419433 4839 scope.go:117] "RemoveContainer" 
containerID="7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545" Mar 21 04:48:34 crc kubenswrapper[4839]: E0321 04:48:34.420029 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545\": container with ID starting with 7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545 not found: ID does not exist" containerID="7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.420094 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545"} err="failed to get container status \"7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545\": rpc error: code = NotFound desc = could not find container \"7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545\": container with ID starting with 7e3932a251fb5ea95be45f70c482ec5596ce908fd935cd3f361d8e3a21897545 not found: ID does not exist" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.420115 4839 scope.go:117] "RemoveContainer" containerID="03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc" Mar 21 04:48:34 crc kubenswrapper[4839]: E0321 04:48:34.422031 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc\": container with ID starting with 03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc not found: ID does not exist" containerID="03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.422085 4839 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc"} err="failed to get container status \"03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc\": rpc error: code = NotFound desc = could not find container \"03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc\": container with ID starting with 03d38a8e228a32b6d2c321880966d58bdb028deb1a7c9e33a2a7cd25a69ed0dc not found: ID does not exist" Mar 21 04:48:34 crc kubenswrapper[4839]: I0321 04:48:34.466242 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" path="/var/lib/kubelet/pods/fa3a88fe-e92a-48a2-9d53-97c2e2c16407/volumes" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.302524 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq"] Mar 21 04:48:42 crc kubenswrapper[4839]: E0321 04:48:42.305099 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerName="dnsmasq-dns" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.305132 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerName="dnsmasq-dns" Mar 21 04:48:42 crc kubenswrapper[4839]: E0321 04:48:42.305158 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerName="init" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.305166 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerName="init" Mar 21 04:48:42 crc kubenswrapper[4839]: E0321 04:48:42.305188 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="extract-utilities" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.305196 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="extract-utilities" Mar 21 04:48:42 crc kubenswrapper[4839]: E0321 04:48:42.305222 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="extract-content" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.305230 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="extract-content" Mar 21 04:48:42 crc kubenswrapper[4839]: E0321 04:48:42.305272 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerName="init" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.305280 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerName="init" Mar 21 04:48:42 crc kubenswrapper[4839]: E0321 04:48:42.305288 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerName="dnsmasq-dns" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.305295 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerName="dnsmasq-dns" Mar 21 04:48:42 crc kubenswrapper[4839]: E0321 04:48:42.305320 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="registry-server" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.305334 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="registry-server" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.307017 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3a88fe-e92a-48a2-9d53-97c2e2c16407" containerName="dnsmasq-dns" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.307072 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="346daec7-d0f8-4237-a189-2b84c2a65207" containerName="registry-server" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.307115 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b06ab0-2209-4fb3-a837-ec755b412525" containerName="dnsmasq-dns" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.309183 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.321846 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.321964 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.322169 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.322397 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.350418 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq"] Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.469636 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.470527 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.470984 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmc9j\" (UniqueName: \"kubernetes.io/projected/acb0bb61-c53a-4171-bca5-4a3141d6904a-kube-api-access-qmc9j\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.471060 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.572930 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.572996 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.573106 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmc9j\" (UniqueName: \"kubernetes.io/projected/acb0bb61-c53a-4171-bca5-4a3141d6904a-kube-api-access-qmc9j\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.573158 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.584435 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.584480 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.586400 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.594653 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmc9j\" (UniqueName: \"kubernetes.io/projected/acb0bb61-c53a-4171-bca5-4a3141d6904a-kube-api-access-qmc9j\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:42 crc kubenswrapper[4839]: I0321 04:48:42.654531 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:48:43 crc kubenswrapper[4839]: I0321 04:48:43.254452 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq"] Mar 21 04:48:43 crc kubenswrapper[4839]: W0321 04:48:43.256274 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacb0bb61_c53a_4171_bca5_4a3141d6904a.slice/crio-09229b426fbce2a8b2fcac2d33b487a20a81e667d376ebfa1c1a2ccc2f1ace9f WatchSource:0}: Error finding container 09229b426fbce2a8b2fcac2d33b487a20a81e667d376ebfa1c1a2ccc2f1ace9f: Status 404 returned error can't find the container with id 09229b426fbce2a8b2fcac2d33b487a20a81e667d376ebfa1c1a2ccc2f1ace9f Mar 21 04:48:43 crc kubenswrapper[4839]: I0321 04:48:43.427711 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" event={"ID":"acb0bb61-c53a-4171-bca5-4a3141d6904a","Type":"ContainerStarted","Data":"09229b426fbce2a8b2fcac2d33b487a20a81e667d376ebfa1c1a2ccc2f1ace9f"} Mar 21 04:48:46 crc kubenswrapper[4839]: I0321 04:48:46.466124 4839 generic.go:334] "Generic (PLEG): container finished" podID="bfff67da-8ea4-4798-9b8d-58a3abac4347" containerID="644c67f4886f966671f8710482c7d08251548f19faca3155012ce9a9e6664332" exitCode=0 Mar 21 04:48:46 crc kubenswrapper[4839]: I0321 04:48:46.472133 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bfff67da-8ea4-4798-9b8d-58a3abac4347","Type":"ContainerDied","Data":"644c67f4886f966671f8710482c7d08251548f19faca3155012ce9a9e6664332"} Mar 21 04:48:47 crc kubenswrapper[4839]: I0321 04:48:47.485802 4839 generic.go:334] "Generic (PLEG): container finished" podID="fa82c4a0-2b0e-4e22-9e91-7fc899122414" containerID="7ac821c6a3f3ad08d86993b9d344baebd30c5481183b9c62427b46b6443230a7" exitCode=0 Mar 21 04:48:47 crc 
kubenswrapper[4839]: I0321 04:48:47.485886 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fa82c4a0-2b0e-4e22-9e91-7fc899122414","Type":"ContainerDied","Data":"7ac821c6a3f3ad08d86993b9d344baebd30c5481183b9c62427b46b6443230a7"} Mar 21 04:48:47 crc kubenswrapper[4839]: I0321 04:48:47.488960 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bfff67da-8ea4-4798-9b8d-58a3abac4347","Type":"ContainerStarted","Data":"8ce022af499ca6abdf4830b373ebbc4fd26f136d225297b83b854a27500743cb"} Mar 21 04:48:47 crc kubenswrapper[4839]: I0321 04:48:47.489939 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 21 04:48:47 crc kubenswrapper[4839]: I0321 04:48:47.542259 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.542241233 podStartE2EDuration="36.542241233s" podCreationTimestamp="2026-03-21 04:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:48:47.540871725 +0000 UTC m=+1531.868658401" watchObservedRunningTime="2026-03-21 04:48:47.542241233 +0000 UTC m=+1531.870027909" Mar 21 04:48:48 crc kubenswrapper[4839]: I0321 04:48:48.132494 4839 scope.go:117] "RemoveContainer" containerID="13a62d6a43116fe61b0ca05db07b93400dd1b1d3d2760f545556037c6e4992fd" Mar 21 04:48:53 crc kubenswrapper[4839]: I0321 04:48:53.556472 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" event={"ID":"acb0bb61-c53a-4171-bca5-4a3141d6904a","Type":"ContainerStarted","Data":"b92ce422c327813606bddc64da9b54ef291ca71f42d2c5fa25cbac824f384d46"} Mar 21 04:48:53 crc kubenswrapper[4839]: I0321 04:48:53.560781 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"fa82c4a0-2b0e-4e22-9e91-7fc899122414","Type":"ContainerStarted","Data":"3f89c476ec85132bd8e8dfe48defcb72b9dbc1aaae748a5f27ad4343af433ead"} Mar 21 04:48:53 crc kubenswrapper[4839]: I0321 04:48:53.561180 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:48:53 crc kubenswrapper[4839]: I0321 04:48:53.583887 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" podStartSLOduration=1.764881259 podStartE2EDuration="11.583860913s" podCreationTimestamp="2026-03-21 04:48:42 +0000 UTC" firstStartedPulling="2026-03-21 04:48:43.259912685 +0000 UTC m=+1527.587699361" lastFinishedPulling="2026-03-21 04:48:53.078892339 +0000 UTC m=+1537.406679015" observedRunningTime="2026-03-21 04:48:53.571579418 +0000 UTC m=+1537.899366104" watchObservedRunningTime="2026-03-21 04:48:53.583860913 +0000 UTC m=+1537.911647589" Mar 21 04:48:53 crc kubenswrapper[4839]: I0321 04:48:53.603684 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.603663898 podStartE2EDuration="41.603663898s" podCreationTimestamp="2026-03-21 04:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 04:48:53.597718722 +0000 UTC m=+1537.925505418" watchObservedRunningTime="2026-03-21 04:48:53.603663898 +0000 UTC m=+1537.931450574" Mar 21 04:49:00 crc kubenswrapper[4839]: I0321 04:49:00.980816 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:49:00 crc kubenswrapper[4839]: I0321 04:49:00.981371 4839 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:49:01 crc kubenswrapper[4839]: I0321 04:49:01.468329 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 21 04:49:06 crc kubenswrapper[4839]: I0321 04:49:06.688719 4839 generic.go:334] "Generic (PLEG): container finished" podID="acb0bb61-c53a-4171-bca5-4a3141d6904a" containerID="b92ce422c327813606bddc64da9b54ef291ca71f42d2c5fa25cbac824f384d46" exitCode=0 Mar 21 04:49:06 crc kubenswrapper[4839]: I0321 04:49:06.688804 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" event={"ID":"acb0bb61-c53a-4171-bca5-4a3141d6904a","Type":"ContainerDied","Data":"b92ce422c327813606bddc64da9b54ef291ca71f42d2c5fa25cbac824f384d46"} Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.379901 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.474277 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-inventory\") pod \"acb0bb61-c53a-4171-bca5-4a3141d6904a\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.474366 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmc9j\" (UniqueName: \"kubernetes.io/projected/acb0bb61-c53a-4171-bca5-4a3141d6904a-kube-api-access-qmc9j\") pod \"acb0bb61-c53a-4171-bca5-4a3141d6904a\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.476113 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-ssh-key-openstack-edpm-ipam\") pod \"acb0bb61-c53a-4171-bca5-4a3141d6904a\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.476250 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-repo-setup-combined-ca-bundle\") pod \"acb0bb61-c53a-4171-bca5-4a3141d6904a\" (UID: \"acb0bb61-c53a-4171-bca5-4a3141d6904a\") " Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.487418 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "acb0bb61-c53a-4171-bca5-4a3141d6904a" (UID: "acb0bb61-c53a-4171-bca5-4a3141d6904a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.487731 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb0bb61-c53a-4171-bca5-4a3141d6904a-kube-api-access-qmc9j" (OuterVolumeSpecName: "kube-api-access-qmc9j") pod "acb0bb61-c53a-4171-bca5-4a3141d6904a" (UID: "acb0bb61-c53a-4171-bca5-4a3141d6904a"). InnerVolumeSpecName "kube-api-access-qmc9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.504846 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-inventory" (OuterVolumeSpecName: "inventory") pod "acb0bb61-c53a-4171-bca5-4a3141d6904a" (UID: "acb0bb61-c53a-4171-bca5-4a3141d6904a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.531405 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "acb0bb61-c53a-4171-bca5-4a3141d6904a" (UID: "acb0bb61-c53a-4171-bca5-4a3141d6904a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.578972 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.579028 4839 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.579042 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acb0bb61-c53a-4171-bca5-4a3141d6904a-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.579054 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmc9j\" (UniqueName: \"kubernetes.io/projected/acb0bb61-c53a-4171-bca5-4a3141d6904a-kube-api-access-qmc9j\") on node \"crc\" DevicePath \"\"" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.709538 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" event={"ID":"acb0bb61-c53a-4171-bca5-4a3141d6904a","Type":"ContainerDied","Data":"09229b426fbce2a8b2fcac2d33b487a20a81e667d376ebfa1c1a2ccc2f1ace9f"} Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.710042 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09229b426fbce2a8b2fcac2d33b487a20a81e667d376ebfa1c1a2ccc2f1ace9f" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.709653 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.782607 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn"] Mar 21 04:49:08 crc kubenswrapper[4839]: E0321 04:49:08.782994 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb0bb61-c53a-4171-bca5-4a3141d6904a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.783011 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb0bb61-c53a-4171-bca5-4a3141d6904a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.783208 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb0bb61-c53a-4171-bca5-4a3141d6904a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.783799 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.788006 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.788051 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.788303 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.788438 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.804419 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn"] Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.884053 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.884451 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7w86\" (UniqueName: \"kubernetes.io/projected/a6dd2bff-543f-4ebb-b908-3e528f322548-kube-api-access-r7w86\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.884480 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.986001 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.986073 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7w86\" (UniqueName: \"kubernetes.io/projected/a6dd2bff-543f-4ebb-b908-3e528f322548-kube-api-access-r7w86\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.986095 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.990663 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:08 crc kubenswrapper[4839]: I0321 04:49:08.990677 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:09 crc kubenswrapper[4839]: I0321 04:49:09.003489 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7w86\" (UniqueName: \"kubernetes.io/projected/a6dd2bff-543f-4ebb-b908-3e528f322548-kube-api-access-r7w86\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pgfnn\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:09 crc kubenswrapper[4839]: I0321 04:49:09.120715 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:09 crc kubenswrapper[4839]: I0321 04:49:09.634539 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn"] Mar 21 04:49:09 crc kubenswrapper[4839]: W0321 04:49:09.635947 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6dd2bff_543f_4ebb_b908_3e528f322548.slice/crio-5c63612b55b73f4c3891be895eb45551655a6f1cd2f188f983a2ac1052418cdd WatchSource:0}: Error finding container 5c63612b55b73f4c3891be895eb45551655a6f1cd2f188f983a2ac1052418cdd: Status 404 returned error can't find the container with id 5c63612b55b73f4c3891be895eb45551655a6f1cd2f188f983a2ac1052418cdd Mar 21 04:49:09 crc kubenswrapper[4839]: I0321 04:49:09.718370 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" event={"ID":"a6dd2bff-543f-4ebb-b908-3e528f322548","Type":"ContainerStarted","Data":"5c63612b55b73f4c3891be895eb45551655a6f1cd2f188f983a2ac1052418cdd"} Mar 21 04:49:12 crc kubenswrapper[4839]: I0321 04:49:12.623775 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 21 04:49:12 crc kubenswrapper[4839]: I0321 04:49:12.759050 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" event={"ID":"a6dd2bff-543f-4ebb-b908-3e528f322548","Type":"ContainerStarted","Data":"dffad1f751349d3858c08b824a236eeafa25bd177c267d117a7cc25a3bfdb104"} Mar 21 04:49:12 crc kubenswrapper[4839]: I0321 04:49:12.783356 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" podStartSLOduration=2.746321885 podStartE2EDuration="4.783334881s" podCreationTimestamp="2026-03-21 04:49:08 +0000 UTC" 
firstStartedPulling="2026-03-21 04:49:09.638802129 +0000 UTC m=+1553.966588805" lastFinishedPulling="2026-03-21 04:49:11.675815125 +0000 UTC m=+1556.003601801" observedRunningTime="2026-03-21 04:49:12.774441232 +0000 UTC m=+1557.102227918" watchObservedRunningTime="2026-03-21 04:49:12.783334881 +0000 UTC m=+1557.111121567" Mar 21 04:49:14 crc kubenswrapper[4839]: I0321 04:49:14.780648 4839 generic.go:334] "Generic (PLEG): container finished" podID="a6dd2bff-543f-4ebb-b908-3e528f322548" containerID="dffad1f751349d3858c08b824a236eeafa25bd177c267d117a7cc25a3bfdb104" exitCode=0 Mar 21 04:49:14 crc kubenswrapper[4839]: I0321 04:49:14.780724 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" event={"ID":"a6dd2bff-543f-4ebb-b908-3e528f322548","Type":"ContainerDied","Data":"dffad1f751349d3858c08b824a236eeafa25bd177c267d117a7cc25a3bfdb104"} Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.243098 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.325100 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-inventory\") pod \"a6dd2bff-543f-4ebb-b908-3e528f322548\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.325196 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7w86\" (UniqueName: \"kubernetes.io/projected/a6dd2bff-543f-4ebb-b908-3e528f322548-kube-api-access-r7w86\") pod \"a6dd2bff-543f-4ebb-b908-3e528f322548\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.325260 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-ssh-key-openstack-edpm-ipam\") pod \"a6dd2bff-543f-4ebb-b908-3e528f322548\" (UID: \"a6dd2bff-543f-4ebb-b908-3e528f322548\") " Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.331967 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6dd2bff-543f-4ebb-b908-3e528f322548-kube-api-access-r7w86" (OuterVolumeSpecName: "kube-api-access-r7w86") pod "a6dd2bff-543f-4ebb-b908-3e528f322548" (UID: "a6dd2bff-543f-4ebb-b908-3e528f322548"). InnerVolumeSpecName "kube-api-access-r7w86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.357394 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a6dd2bff-543f-4ebb-b908-3e528f322548" (UID: "a6dd2bff-543f-4ebb-b908-3e528f322548"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.373997 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-inventory" (OuterVolumeSpecName: "inventory") pod "a6dd2bff-543f-4ebb-b908-3e528f322548" (UID: "a6dd2bff-543f-4ebb-b908-3e528f322548"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.428231 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.428287 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7w86\" (UniqueName: \"kubernetes.io/projected/a6dd2bff-543f-4ebb-b908-3e528f322548-kube-api-access-r7w86\") on node \"crc\" DevicePath \"\"" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.428302 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6dd2bff-543f-4ebb-b908-3e528f322548-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.804441 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" 
event={"ID":"a6dd2bff-543f-4ebb-b908-3e528f322548","Type":"ContainerDied","Data":"5c63612b55b73f4c3891be895eb45551655a6f1cd2f188f983a2ac1052418cdd"} Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.804811 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c63612b55b73f4c3891be895eb45551655a6f1cd2f188f983a2ac1052418cdd" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.804635 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pgfnn" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.869964 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz"] Mar 21 04:49:16 crc kubenswrapper[4839]: E0321 04:49:16.870413 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6dd2bff-543f-4ebb-b908-3e528f322548" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.870441 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6dd2bff-543f-4ebb-b908-3e528f322548" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.870905 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6dd2bff-543f-4ebb-b908-3e528f322548" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.871505 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.879157 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.879297 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.879346 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.879493 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.883799 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz"] Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.938405 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.938487 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjf5g\" (UniqueName: \"kubernetes.io/projected/a1d76458-d587-4960-9bcc-7e3d3122b44d-kube-api-access-zjf5g\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 
04:49:16.938554 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:16 crc kubenswrapper[4839]: I0321 04:49:16.938629 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.039956 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjf5g\" (UniqueName: \"kubernetes.io/projected/a1d76458-d587-4960-9bcc-7e3d3122b44d-kube-api-access-zjf5g\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.040032 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.040065 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.040159 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.045182 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.045335 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.045806 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.056963 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjf5g\" (UniqueName: \"kubernetes.io/projected/a1d76458-d587-4960-9bcc-7e3d3122b44d-kube-api-access-zjf5g\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.191780 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.792679 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz"] Mar 21 04:49:17 crc kubenswrapper[4839]: W0321 04:49:17.793479 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1d76458_d587_4960_9bcc_7e3d3122b44d.slice/crio-822a23cdb1c22865313c8050c3f022e1750a4731cc96eac72c126accbbe28877 WatchSource:0}: Error finding container 822a23cdb1c22865313c8050c3f022e1750a4731cc96eac72c126accbbe28877: Status 404 returned error can't find the container with id 822a23cdb1c22865313c8050c3f022e1750a4731cc96eac72c126accbbe28877 Mar 21 04:49:17 crc kubenswrapper[4839]: I0321 04:49:17.816127 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" event={"ID":"a1d76458-d587-4960-9bcc-7e3d3122b44d","Type":"ContainerStarted","Data":"822a23cdb1c22865313c8050c3f022e1750a4731cc96eac72c126accbbe28877"} Mar 21 04:49:18 crc kubenswrapper[4839]: I0321 04:49:18.825524 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" 
event={"ID":"a1d76458-d587-4960-9bcc-7e3d3122b44d","Type":"ContainerStarted","Data":"70456d0f0e0073bde0ceeec7013aa756cec85385ed747be87d7d96cfa8d04986"} Mar 21 04:49:18 crc kubenswrapper[4839]: I0321 04:49:18.852865 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" podStartSLOduration=2.36990932 podStartE2EDuration="2.852841235s" podCreationTimestamp="2026-03-21 04:49:16 +0000 UTC" firstStartedPulling="2026-03-21 04:49:17.7984729 +0000 UTC m=+1562.126259576" lastFinishedPulling="2026-03-21 04:49:18.281404815 +0000 UTC m=+1562.609191491" observedRunningTime="2026-03-21 04:49:18.8437631 +0000 UTC m=+1563.171549776" watchObservedRunningTime="2026-03-21 04:49:18.852841235 +0000 UTC m=+1563.180627911" Mar 21 04:49:30 crc kubenswrapper[4839]: I0321 04:49:30.980554 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:49:30 crc kubenswrapper[4839]: I0321 04:49:30.981470 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:49:30 crc kubenswrapper[4839]: I0321 04:49:30.981552 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:49:30 crc kubenswrapper[4839]: I0321 04:49:30.982678 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"c031ed8f7b7576f57e9530a46687f2f2de2e5c2a62f42435eef393cfd7af2b37"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:49:30 crc kubenswrapper[4839]: I0321 04:49:30.982744 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://c031ed8f7b7576f57e9530a46687f2f2de2e5c2a62f42435eef393cfd7af2b37" gracePeriod=600 Mar 21 04:49:31 crc kubenswrapper[4839]: I0321 04:49:31.940647 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="c031ed8f7b7576f57e9530a46687f2f2de2e5c2a62f42435eef393cfd7af2b37" exitCode=0 Mar 21 04:49:31 crc kubenswrapper[4839]: I0321 04:49:31.940723 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"c031ed8f7b7576f57e9530a46687f2f2de2e5c2a62f42435eef393cfd7af2b37"} Mar 21 04:49:31 crc kubenswrapper[4839]: I0321 04:49:31.941000 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"} Mar 21 04:49:31 crc kubenswrapper[4839]: I0321 04:49:31.941018 4839 scope.go:117] "RemoveContainer" containerID="48bb6d2443587cf3023178aa72ea424c113f55b1e7600821dbf21c214de8e70f" Mar 21 04:49:53 crc kubenswrapper[4839]: I0321 04:49:53.176967 4839 scope.go:117] "RemoveContainer" containerID="d792580397713b6021c551a3f4cbfaf97f1c5637484d37b25e33338bf6fc4ac7" Mar 21 04:49:53 crc kubenswrapper[4839]: I0321 
04:49:53.208369 4839 scope.go:117] "RemoveContainer" containerID="4f187e9e33fa923f2b2629c019ef104918ae6112912ac0480384a8c6a651a762" Mar 21 04:49:53 crc kubenswrapper[4839]: I0321 04:49:53.252064 4839 scope.go:117] "RemoveContainer" containerID="6b5e6693316b5cfa06c0f6c8e7e9f37a0398c873966489b56f00cbd44f60fd16" Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.138493 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567810-hr2wf"] Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.140337 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567810-hr2wf" Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.143321 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.143331 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.143325 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.151785 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567810-hr2wf"] Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.307869 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp27f\" (UniqueName: \"kubernetes.io/projected/cf9d6591-e9e7-485d-96f3-8f32958ac530-kube-api-access-qp27f\") pod \"auto-csr-approver-29567810-hr2wf\" (UID: \"cf9d6591-e9e7-485d-96f3-8f32958ac530\") " pod="openshift-infra/auto-csr-approver-29567810-hr2wf" Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.410510 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp27f\" (UniqueName: 
\"kubernetes.io/projected/cf9d6591-e9e7-485d-96f3-8f32958ac530-kube-api-access-qp27f\") pod \"auto-csr-approver-29567810-hr2wf\" (UID: \"cf9d6591-e9e7-485d-96f3-8f32958ac530\") " pod="openshift-infra/auto-csr-approver-29567810-hr2wf" Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.429147 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp27f\" (UniqueName: \"kubernetes.io/projected/cf9d6591-e9e7-485d-96f3-8f32958ac530-kube-api-access-qp27f\") pod \"auto-csr-approver-29567810-hr2wf\" (UID: \"cf9d6591-e9e7-485d-96f3-8f32958ac530\") " pod="openshift-infra/auto-csr-approver-29567810-hr2wf" Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.459603 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567810-hr2wf" Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.885241 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567810-hr2wf"] Mar 21 04:50:00 crc kubenswrapper[4839]: I0321 04:50:00.895852 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 04:50:01 crc kubenswrapper[4839]: I0321 04:50:01.250771 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567810-hr2wf" event={"ID":"cf9d6591-e9e7-485d-96f3-8f32958ac530","Type":"ContainerStarted","Data":"f6e573e57464ee6a595844a7133ae0d4c7c7b617bdc87105aee518eda761c6ca"} Mar 21 04:50:03 crc kubenswrapper[4839]: I0321 04:50:03.271855 4839 generic.go:334] "Generic (PLEG): container finished" podID="cf9d6591-e9e7-485d-96f3-8f32958ac530" containerID="3f39162e6963343de8c3eafe8a89ac888be7f9493499afd89bf8375748fc8e0f" exitCode=0 Mar 21 04:50:03 crc kubenswrapper[4839]: I0321 04:50:03.272005 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567810-hr2wf" 
event={"ID":"cf9d6591-e9e7-485d-96f3-8f32958ac530","Type":"ContainerDied","Data":"3f39162e6963343de8c3eafe8a89ac888be7f9493499afd89bf8375748fc8e0f"} Mar 21 04:50:04 crc kubenswrapper[4839]: I0321 04:50:04.629518 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567810-hr2wf" Mar 21 04:50:04 crc kubenswrapper[4839]: I0321 04:50:04.794418 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp27f\" (UniqueName: \"kubernetes.io/projected/cf9d6591-e9e7-485d-96f3-8f32958ac530-kube-api-access-qp27f\") pod \"cf9d6591-e9e7-485d-96f3-8f32958ac530\" (UID: \"cf9d6591-e9e7-485d-96f3-8f32958ac530\") " Mar 21 04:50:04 crc kubenswrapper[4839]: I0321 04:50:04.803368 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf9d6591-e9e7-485d-96f3-8f32958ac530-kube-api-access-qp27f" (OuterVolumeSpecName: "kube-api-access-qp27f") pod "cf9d6591-e9e7-485d-96f3-8f32958ac530" (UID: "cf9d6591-e9e7-485d-96f3-8f32958ac530"). InnerVolumeSpecName "kube-api-access-qp27f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:50:04 crc kubenswrapper[4839]: I0321 04:50:04.897765 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp27f\" (UniqueName: \"kubernetes.io/projected/cf9d6591-e9e7-485d-96f3-8f32958ac530-kube-api-access-qp27f\") on node \"crc\" DevicePath \"\"" Mar 21 04:50:05 crc kubenswrapper[4839]: I0321 04:50:05.295310 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567810-hr2wf" event={"ID":"cf9d6591-e9e7-485d-96f3-8f32958ac530","Type":"ContainerDied","Data":"f6e573e57464ee6a595844a7133ae0d4c7c7b617bdc87105aee518eda761c6ca"} Mar 21 04:50:05 crc kubenswrapper[4839]: I0321 04:50:05.295358 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6e573e57464ee6a595844a7133ae0d4c7c7b617bdc87105aee518eda761c6ca" Mar 21 04:50:05 crc kubenswrapper[4839]: I0321 04:50:05.295393 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567810-hr2wf" Mar 21 04:50:05 crc kubenswrapper[4839]: I0321 04:50:05.721195 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567804-jn7hw"] Mar 21 04:50:05 crc kubenswrapper[4839]: I0321 04:50:05.730056 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567804-jn7hw"] Mar 21 04:50:06 crc kubenswrapper[4839]: I0321 04:50:06.464444 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="117f0438-5ab3-4616-b574-c5bbc43e8ac9" path="/var/lib/kubelet/pods/117f0438-5ab3-4616-b574-c5bbc43e8ac9/volumes" Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.047507 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wpn7f"] Mar 21 04:50:31 crc kubenswrapper[4839]: E0321 04:50:31.048503 4839 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cf9d6591-e9e7-485d-96f3-8f32958ac530" containerName="oc" Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.048518 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf9d6591-e9e7-485d-96f3-8f32958ac530" containerName="oc" Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.048699 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf9d6591-e9e7-485d-96f3-8f32958ac530" containerName="oc" Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.050116 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.058802 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wpn7f"] Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.197481 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-catalog-content\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.197736 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-utilities\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.197772 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz2cp\" (UniqueName: \"kubernetes.io/projected/919b184f-e6e2-4633-aad8-37bbe3fa579f-kube-api-access-pz2cp\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " 
pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.298933 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-utilities\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.298986 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz2cp\" (UniqueName: \"kubernetes.io/projected/919b184f-e6e2-4633-aad8-37bbe3fa579f-kube-api-access-pz2cp\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.299086 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-catalog-content\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.299635 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-utilities\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.299666 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-catalog-content\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " 
pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.320352 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz2cp\" (UniqueName: \"kubernetes.io/projected/919b184f-e6e2-4633-aad8-37bbe3fa579f-kube-api-access-pz2cp\") pod \"community-operators-wpn7f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:31 crc kubenswrapper[4839]: I0321 04:50:31.383063 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:32 crc kubenswrapper[4839]: I0321 04:50:32.009348 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wpn7f"] Mar 21 04:50:32 crc kubenswrapper[4839]: I0321 04:50:32.549767 4839 generic.go:334] "Generic (PLEG): container finished" podID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerID="d51b8301c3c8bb03eba26fad356fbcd7318953d4f1b3376abf01e815d6509cf1" exitCode=0 Mar 21 04:50:32 crc kubenswrapper[4839]: I0321 04:50:32.550073 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpn7f" event={"ID":"919b184f-e6e2-4633-aad8-37bbe3fa579f","Type":"ContainerDied","Data":"d51b8301c3c8bb03eba26fad356fbcd7318953d4f1b3376abf01e815d6509cf1"} Mar 21 04:50:32 crc kubenswrapper[4839]: I0321 04:50:32.550104 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpn7f" event={"ID":"919b184f-e6e2-4633-aad8-37bbe3fa579f","Type":"ContainerStarted","Data":"3213ef092a94f429bd88322b83c5988943b92e89dae4bbca5523e68fbd740666"} Mar 21 04:50:33 crc kubenswrapper[4839]: I0321 04:50:33.562498 4839 generic.go:334] "Generic (PLEG): container finished" podID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerID="2c82a777ff61fab81f58fb678dd999ca28aa8bd736bcf9c16dbd8bfc0f56d504" exitCode=0 Mar 21 04:50:33 crc 
kubenswrapper[4839]: I0321 04:50:33.562620 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpn7f" event={"ID":"919b184f-e6e2-4633-aad8-37bbe3fa579f","Type":"ContainerDied","Data":"2c82a777ff61fab81f58fb678dd999ca28aa8bd736bcf9c16dbd8bfc0f56d504"} Mar 21 04:50:34 crc kubenswrapper[4839]: I0321 04:50:34.576010 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpn7f" event={"ID":"919b184f-e6e2-4633-aad8-37bbe3fa579f","Type":"ContainerStarted","Data":"7cc590417128252053f41401db6235f13824720b19dbf6d99dc75ef1117fb028"} Mar 21 04:50:34 crc kubenswrapper[4839]: I0321 04:50:34.602134 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wpn7f" podStartSLOduration=2.175491382 podStartE2EDuration="3.602112258s" podCreationTimestamp="2026-03-21 04:50:31 +0000 UTC" firstStartedPulling="2026-03-21 04:50:32.551748969 +0000 UTC m=+1636.879535645" lastFinishedPulling="2026-03-21 04:50:33.978369845 +0000 UTC m=+1638.306156521" observedRunningTime="2026-03-21 04:50:34.600226655 +0000 UTC m=+1638.928013331" watchObservedRunningTime="2026-03-21 04:50:34.602112258 +0000 UTC m=+1638.929898924" Mar 21 04:50:41 crc kubenswrapper[4839]: I0321 04:50:41.384256 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:41 crc kubenswrapper[4839]: I0321 04:50:41.385014 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:41 crc kubenswrapper[4839]: I0321 04:50:41.441378 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:41 crc kubenswrapper[4839]: I0321 04:50:41.701831 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:41 crc kubenswrapper[4839]: I0321 04:50:41.752689 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wpn7f"] Mar 21 04:50:43 crc kubenswrapper[4839]: I0321 04:50:43.676379 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wpn7f" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="registry-server" containerID="cri-o://7cc590417128252053f41401db6235f13824720b19dbf6d99dc75ef1117fb028" gracePeriod=2 Mar 21 04:50:44 crc kubenswrapper[4839]: I0321 04:50:44.729123 4839 generic.go:334] "Generic (PLEG): container finished" podID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerID="7cc590417128252053f41401db6235f13824720b19dbf6d99dc75ef1117fb028" exitCode=0 Mar 21 04:50:44 crc kubenswrapper[4839]: I0321 04:50:44.729209 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpn7f" event={"ID":"919b184f-e6e2-4633-aad8-37bbe3fa579f","Type":"ContainerDied","Data":"7cc590417128252053f41401db6235f13824720b19dbf6d99dc75ef1117fb028"} Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.017901 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.168093 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz2cp\" (UniqueName: \"kubernetes.io/projected/919b184f-e6e2-4633-aad8-37bbe3fa579f-kube-api-access-pz2cp\") pod \"919b184f-e6e2-4633-aad8-37bbe3fa579f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.168453 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-utilities\") pod \"919b184f-e6e2-4633-aad8-37bbe3fa579f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.168736 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-catalog-content\") pod \"919b184f-e6e2-4633-aad8-37bbe3fa579f\" (UID: \"919b184f-e6e2-4633-aad8-37bbe3fa579f\") " Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.169556 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-utilities" (OuterVolumeSpecName: "utilities") pod "919b184f-e6e2-4633-aad8-37bbe3fa579f" (UID: "919b184f-e6e2-4633-aad8-37bbe3fa579f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.169722 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.177121 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919b184f-e6e2-4633-aad8-37bbe3fa579f-kube-api-access-pz2cp" (OuterVolumeSpecName: "kube-api-access-pz2cp") pod "919b184f-e6e2-4633-aad8-37bbe3fa579f" (UID: "919b184f-e6e2-4633-aad8-37bbe3fa579f"). InnerVolumeSpecName "kube-api-access-pz2cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.241174 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "919b184f-e6e2-4633-aad8-37bbe3fa579f" (UID: "919b184f-e6e2-4633-aad8-37bbe3fa579f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.272294 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919b184f-e6e2-4633-aad8-37bbe3fa579f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.272347 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz2cp\" (UniqueName: \"kubernetes.io/projected/919b184f-e6e2-4633-aad8-37bbe3fa579f-kube-api-access-pz2cp\") on node \"crc\" DevicePath \"\"" Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.745881 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wpn7f" event={"ID":"919b184f-e6e2-4633-aad8-37bbe3fa579f","Type":"ContainerDied","Data":"3213ef092a94f429bd88322b83c5988943b92e89dae4bbca5523e68fbd740666"} Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.745944 4839 scope.go:117] "RemoveContainer" containerID="7cc590417128252053f41401db6235f13824720b19dbf6d99dc75ef1117fb028" Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.746021 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wpn7f" Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.768133 4839 scope.go:117] "RemoveContainer" containerID="2c82a777ff61fab81f58fb678dd999ca28aa8bd736bcf9c16dbd8bfc0f56d504" Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.800591 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wpn7f"] Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.809500 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wpn7f"] Mar 21 04:50:45 crc kubenswrapper[4839]: I0321 04:50:45.822182 4839 scope.go:117] "RemoveContainer" containerID="d51b8301c3c8bb03eba26fad356fbcd7318953d4f1b3376abf01e815d6509cf1" Mar 21 04:50:46 crc kubenswrapper[4839]: I0321 04:50:46.464133 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" path="/var/lib/kubelet/pods/919b184f-e6e2-4633-aad8-37bbe3fa579f/volumes" Mar 21 04:50:53 crc kubenswrapper[4839]: I0321 04:50:53.389953 4839 scope.go:117] "RemoveContainer" containerID="d6b588c7e0ea2083a499b261b5c79b627db47988e3cf9da2d927f03f127f5e76" Mar 21 04:50:53 crc kubenswrapper[4839]: I0321 04:50:53.416083 4839 scope.go:117] "RemoveContainer" containerID="d7d86bc6d96470a04c1fc681cf73561b422455dc884417b0677e9ae418f682f0" Mar 21 04:50:53 crc kubenswrapper[4839]: I0321 04:50:53.464725 4839 scope.go:117] "RemoveContainer" containerID="91c89b78e4a205a25af8a93dc758c0974e237fac7942a3cc2a1f6b03e61923e1" Mar 21 04:50:53 crc kubenswrapper[4839]: I0321 04:50:53.499782 4839 scope.go:117] "RemoveContainer" containerID="2233fa3f3ad560ada373befd98764d7c67680bcb094c6c63415e8ef4dc05b7f7" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.304253 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wnqhz"] Mar 21 04:50:59 crc kubenswrapper[4839]: E0321 04:50:59.305250 4839 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="extract-content" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.305266 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="extract-content" Mar 21 04:50:59 crc kubenswrapper[4839]: E0321 04:50:59.305280 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="extract-utilities" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.305288 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="extract-utilities" Mar 21 04:50:59 crc kubenswrapper[4839]: E0321 04:50:59.305303 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="registry-server" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.305311 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="registry-server" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.305549 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="919b184f-e6e2-4633-aad8-37bbe3fa579f" containerName="registry-server" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.308405 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.321121 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnqhz"] Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.412100 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxff7\" (UniqueName: \"kubernetes.io/projected/49f12ee9-da5c-44bf-aa45-02640350f0ea-kube-api-access-nxff7\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.412154 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-utilities\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.412213 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-catalog-content\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.514031 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxff7\" (UniqueName: \"kubernetes.io/projected/49f12ee9-da5c-44bf-aa45-02640350f0ea-kube-api-access-nxff7\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.514106 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-utilities\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.514176 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-catalog-content\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.515162 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-utilities\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.515224 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-catalog-content\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.539808 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxff7\" (UniqueName: \"kubernetes.io/projected/49f12ee9-da5c-44bf-aa45-02640350f0ea-kube-api-access-nxff7\") pod \"redhat-marketplace-wnqhz\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:50:59 crc kubenswrapper[4839]: I0321 04:50:59.631029 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:51:00 crc kubenswrapper[4839]: I0321 04:51:00.002157 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnqhz"] Mar 21 04:51:00 crc kubenswrapper[4839]: I0321 04:51:00.886819 4839 generic.go:334] "Generic (PLEG): container finished" podID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerID="fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d" exitCode=0 Mar 21 04:51:00 crc kubenswrapper[4839]: I0321 04:51:00.887685 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnqhz" event={"ID":"49f12ee9-da5c-44bf-aa45-02640350f0ea","Type":"ContainerDied","Data":"fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d"} Mar 21 04:51:00 crc kubenswrapper[4839]: I0321 04:51:00.888336 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnqhz" event={"ID":"49f12ee9-da5c-44bf-aa45-02640350f0ea","Type":"ContainerStarted","Data":"76f9974553375f6f0030f46f723765a84752d0e71db2420884c6cbca60a091a3"} Mar 21 04:51:02 crc kubenswrapper[4839]: I0321 04:51:02.905333 4839 generic.go:334] "Generic (PLEG): container finished" podID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerID="e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef" exitCode=0 Mar 21 04:51:02 crc kubenswrapper[4839]: I0321 04:51:02.905443 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnqhz" event={"ID":"49f12ee9-da5c-44bf-aa45-02640350f0ea","Type":"ContainerDied","Data":"e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef"} Mar 21 04:51:04 crc kubenswrapper[4839]: I0321 04:51:04.926037 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnqhz" 
event={"ID":"49f12ee9-da5c-44bf-aa45-02640350f0ea","Type":"ContainerStarted","Data":"9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f"} Mar 21 04:51:04 crc kubenswrapper[4839]: I0321 04:51:04.943599 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wnqhz" podStartSLOduration=3.081683889 podStartE2EDuration="5.943581554s" podCreationTimestamp="2026-03-21 04:50:59 +0000 UTC" firstStartedPulling="2026-03-21 04:51:00.889637399 +0000 UTC m=+1665.217424075" lastFinishedPulling="2026-03-21 04:51:03.751535064 +0000 UTC m=+1668.079321740" observedRunningTime="2026-03-21 04:51:04.942896025 +0000 UTC m=+1669.270682731" watchObservedRunningTime="2026-03-21 04:51:04.943581554 +0000 UTC m=+1669.271368230" Mar 21 04:51:09 crc kubenswrapper[4839]: I0321 04:51:09.631463 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:51:09 crc kubenswrapper[4839]: I0321 04:51:09.632103 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:51:09 crc kubenswrapper[4839]: I0321 04:51:09.691471 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:51:10 crc kubenswrapper[4839]: I0321 04:51:10.004878 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:51:10 crc kubenswrapper[4839]: I0321 04:51:10.048673 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnqhz"] Mar 21 04:51:11 crc kubenswrapper[4839]: I0321 04:51:11.979893 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wnqhz" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="registry-server" 
containerID="cri-o://9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f" gracePeriod=2 Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.492046 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.570365 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-catalog-content\") pod \"49f12ee9-da5c-44bf-aa45-02640350f0ea\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.570557 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-utilities\") pod \"49f12ee9-da5c-44bf-aa45-02640350f0ea\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.570703 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxff7\" (UniqueName: \"kubernetes.io/projected/49f12ee9-da5c-44bf-aa45-02640350f0ea-kube-api-access-nxff7\") pod \"49f12ee9-da5c-44bf-aa45-02640350f0ea\" (UID: \"49f12ee9-da5c-44bf-aa45-02640350f0ea\") " Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.572304 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-utilities" (OuterVolumeSpecName: "utilities") pod "49f12ee9-da5c-44bf-aa45-02640350f0ea" (UID: "49f12ee9-da5c-44bf-aa45-02640350f0ea"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.578194 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f12ee9-da5c-44bf-aa45-02640350f0ea-kube-api-access-nxff7" (OuterVolumeSpecName: "kube-api-access-nxff7") pod "49f12ee9-da5c-44bf-aa45-02640350f0ea" (UID: "49f12ee9-da5c-44bf-aa45-02640350f0ea"). InnerVolumeSpecName "kube-api-access-nxff7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.610704 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49f12ee9-da5c-44bf-aa45-02640350f0ea" (UID: "49f12ee9-da5c-44bf-aa45-02640350f0ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.673366 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.673409 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxff7\" (UniqueName: \"kubernetes.io/projected/49f12ee9-da5c-44bf-aa45-02640350f0ea-kube-api-access-nxff7\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.673424 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f12ee9-da5c-44bf-aa45-02640350f0ea-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.990329 4839 generic.go:334] "Generic (PLEG): container finished" podID="49f12ee9-da5c-44bf-aa45-02640350f0ea" 
containerID="9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f" exitCode=0 Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.990415 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wnqhz" Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.990414 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnqhz" event={"ID":"49f12ee9-da5c-44bf-aa45-02640350f0ea","Type":"ContainerDied","Data":"9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f"} Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.990800 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wnqhz" event={"ID":"49f12ee9-da5c-44bf-aa45-02640350f0ea","Type":"ContainerDied","Data":"76f9974553375f6f0030f46f723765a84752d0e71db2420884c6cbca60a091a3"} Mar 21 04:51:12 crc kubenswrapper[4839]: I0321 04:51:12.990821 4839 scope.go:117] "RemoveContainer" containerID="9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.011705 4839 scope.go:117] "RemoveContainer" containerID="e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.023346 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnqhz"] Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.032974 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wnqhz"] Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.044680 4839 scope.go:117] "RemoveContainer" containerID="fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.081652 4839 scope.go:117] "RemoveContainer" containerID="9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f" Mar 21 
04:51:13 crc kubenswrapper[4839]: E0321 04:51:13.082067 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f\": container with ID starting with 9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f not found: ID does not exist" containerID="9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.082109 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f"} err="failed to get container status \"9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f\": rpc error: code = NotFound desc = could not find container \"9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f\": container with ID starting with 9ac52a8b1e04099ab0e136f38457ebea6a79463830c190acd52c6c983da0849f not found: ID does not exist" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.082139 4839 scope.go:117] "RemoveContainer" containerID="e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef" Mar 21 04:51:13 crc kubenswrapper[4839]: E0321 04:51:13.082542 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef\": container with ID starting with e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef not found: ID does not exist" containerID="e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.082620 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef"} err="failed to get container status 
\"e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef\": rpc error: code = NotFound desc = could not find container \"e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef\": container with ID starting with e0e3e0615c3fd03739091938ee80522f6b47d73a4bc35a5bcf07793354cb96ef not found: ID does not exist" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.082648 4839 scope.go:117] "RemoveContainer" containerID="fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d" Mar 21 04:51:13 crc kubenswrapper[4839]: E0321 04:51:13.083018 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d\": container with ID starting with fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d not found: ID does not exist" containerID="fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d" Mar 21 04:51:13 crc kubenswrapper[4839]: I0321 04:51:13.083046 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d"} err="failed to get container status \"fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d\": rpc error: code = NotFound desc = could not find container \"fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d\": container with ID starting with fcbda48bcced36b2ddee4b31a1c2e13b4e70ba09cb38e4a05e3e019e58b0a43d not found: ID does not exist" Mar 21 04:51:14 crc kubenswrapper[4839]: I0321 04:51:14.465189 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" path="/var/lib/kubelet/pods/49f12ee9-da5c-44bf-aa45-02640350f0ea/volumes" Mar 21 04:51:39 crc kubenswrapper[4839]: I0321 04:51:39.988681 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f4lbn"] Mar 21 04:51:39 
crc kubenswrapper[4839]: E0321 04:51:39.989884 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="registry-server" Mar 21 04:51:39 crc kubenswrapper[4839]: I0321 04:51:39.989901 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="registry-server" Mar 21 04:51:39 crc kubenswrapper[4839]: E0321 04:51:39.989937 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="extract-utilities" Mar 21 04:51:39 crc kubenswrapper[4839]: I0321 04:51:39.989946 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="extract-utilities" Mar 21 04:51:39 crc kubenswrapper[4839]: E0321 04:51:39.989963 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="extract-content" Mar 21 04:51:39 crc kubenswrapper[4839]: I0321 04:51:39.989971 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="extract-content" Mar 21 04:51:39 crc kubenswrapper[4839]: I0321 04:51:39.990197 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f12ee9-da5c-44bf-aa45-02640350f0ea" containerName="registry-server" Mar 21 04:51:39 crc kubenswrapper[4839]: I0321 04:51:39.991832 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.006892 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f4lbn"] Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.145177 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zwfl\" (UniqueName: \"kubernetes.io/projected/913aacec-84de-44a6-98fb-382c04095d62-kube-api-access-8zwfl\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.145279 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.145322 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-utilities\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.247320 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zwfl\" (UniqueName: \"kubernetes.io/projected/913aacec-84de-44a6-98fb-382c04095d62-kube-api-access-8zwfl\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.247467 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.247523 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-utilities\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.248003 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.248048 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-utilities\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.275800 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zwfl\" (UniqueName: \"kubernetes.io/projected/913aacec-84de-44a6-98fb-382c04095d62-kube-api-access-8zwfl\") pod \"certified-operators-f4lbn\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.312664 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:40 crc kubenswrapper[4839]: I0321 04:51:40.806336 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f4lbn"] Mar 21 04:51:41 crc kubenswrapper[4839]: I0321 04:51:41.233673 4839 generic.go:334] "Generic (PLEG): container finished" podID="913aacec-84de-44a6-98fb-382c04095d62" containerID="a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d" exitCode=0 Mar 21 04:51:41 crc kubenswrapper[4839]: I0321 04:51:41.233721 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4lbn" event={"ID":"913aacec-84de-44a6-98fb-382c04095d62","Type":"ContainerDied","Data":"a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d"} Mar 21 04:51:41 crc kubenswrapper[4839]: I0321 04:51:41.233751 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4lbn" event={"ID":"913aacec-84de-44a6-98fb-382c04095d62","Type":"ContainerStarted","Data":"ad270626a3212620dcc08dc79d71191ac5bcf08c5e916533298a0ab1ed1c26c5"} Mar 21 04:51:43 crc kubenswrapper[4839]: I0321 04:51:43.256085 4839 generic.go:334] "Generic (PLEG): container finished" podID="913aacec-84de-44a6-98fb-382c04095d62" containerID="dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6" exitCode=0 Mar 21 04:51:43 crc kubenswrapper[4839]: I0321 04:51:43.256147 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4lbn" event={"ID":"913aacec-84de-44a6-98fb-382c04095d62","Type":"ContainerDied","Data":"dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6"} Mar 21 04:51:43 crc kubenswrapper[4839]: E0321 04:51:43.344474 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod913aacec_84de_44a6_98fb_382c04095d62.slice/crio-dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6.scope\": RecentStats: unable to find data in memory cache]" Mar 21 04:51:45 crc kubenswrapper[4839]: I0321 04:51:45.274185 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4lbn" event={"ID":"913aacec-84de-44a6-98fb-382c04095d62","Type":"ContainerStarted","Data":"476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8"} Mar 21 04:51:45 crc kubenswrapper[4839]: I0321 04:51:45.295444 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f4lbn" podStartSLOduration=3.251086168 podStartE2EDuration="6.295424905s" podCreationTimestamp="2026-03-21 04:51:39 +0000 UTC" firstStartedPulling="2026-03-21 04:51:41.235319107 +0000 UTC m=+1705.563105803" lastFinishedPulling="2026-03-21 04:51:44.279657874 +0000 UTC m=+1708.607444540" observedRunningTime="2026-03-21 04:51:45.29274708 +0000 UTC m=+1709.620533786" watchObservedRunningTime="2026-03-21 04:51:45.295424905 +0000 UTC m=+1709.623211581" Mar 21 04:51:50 crc kubenswrapper[4839]: I0321 04:51:50.312950 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:50 crc kubenswrapper[4839]: I0321 04:51:50.313504 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:50 crc kubenswrapper[4839]: I0321 04:51:50.358130 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:51 crc kubenswrapper[4839]: I0321 04:51:51.368390 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:51 crc kubenswrapper[4839]: 
I0321 04:51:51.417127 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f4lbn"] Mar 21 04:51:53 crc kubenswrapper[4839]: I0321 04:51:53.344305 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f4lbn" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="registry-server" containerID="cri-o://476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8" gracePeriod=2 Mar 21 04:51:53 crc kubenswrapper[4839]: E0321 04:51:53.574982 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod913aacec_84de_44a6_98fb_382c04095d62.slice/crio-conmon-476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8.scope\": RecentStats: unable to find data in memory cache]" Mar 21 04:51:53 crc kubenswrapper[4839]: I0321 04:51:53.815493 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:53 crc kubenswrapper[4839]: I0321 04:51:53.903112 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content\") pod \"913aacec-84de-44a6-98fb-382c04095d62\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " Mar 21 04:51:53 crc kubenswrapper[4839]: I0321 04:51:53.903254 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-utilities\") pod \"913aacec-84de-44a6-98fb-382c04095d62\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " Mar 21 04:51:53 crc kubenswrapper[4839]: I0321 04:51:53.903400 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zwfl\" (UniqueName: \"kubernetes.io/projected/913aacec-84de-44a6-98fb-382c04095d62-kube-api-access-8zwfl\") pod \"913aacec-84de-44a6-98fb-382c04095d62\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " Mar 21 04:51:53 crc kubenswrapper[4839]: I0321 04:51:53.905430 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-utilities" (OuterVolumeSpecName: "utilities") pod "913aacec-84de-44a6-98fb-382c04095d62" (UID: "913aacec-84de-44a6-98fb-382c04095d62"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:51:53 crc kubenswrapper[4839]: I0321 04:51:53.911601 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/913aacec-84de-44a6-98fb-382c04095d62-kube-api-access-8zwfl" (OuterVolumeSpecName: "kube-api-access-8zwfl") pod "913aacec-84de-44a6-98fb-382c04095d62" (UID: "913aacec-84de-44a6-98fb-382c04095d62"). InnerVolumeSpecName "kube-api-access-8zwfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.005303 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.005349 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zwfl\" (UniqueName: \"kubernetes.io/projected/913aacec-84de-44a6-98fb-382c04095d62-kube-api-access-8zwfl\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.353651 4839 generic.go:334] "Generic (PLEG): container finished" podID="913aacec-84de-44a6-98fb-382c04095d62" containerID="476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8" exitCode=0 Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.353730 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f4lbn" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.353737 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4lbn" event={"ID":"913aacec-84de-44a6-98fb-382c04095d62","Type":"ContainerDied","Data":"476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8"} Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.354712 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f4lbn" event={"ID":"913aacec-84de-44a6-98fb-382c04095d62","Type":"ContainerDied","Data":"ad270626a3212620dcc08dc79d71191ac5bcf08c5e916533298a0ab1ed1c26c5"} Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.354735 4839 scope.go:117] "RemoveContainer" containerID="476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.373020 4839 scope.go:117] "RemoveContainer" 
containerID="dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.394388 4839 scope.go:117] "RemoveContainer" containerID="a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.436646 4839 scope.go:117] "RemoveContainer" containerID="476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8" Mar 21 04:51:54 crc kubenswrapper[4839]: E0321 04:51:54.437220 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8\": container with ID starting with 476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8 not found: ID does not exist" containerID="476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.437252 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8"} err="failed to get container status \"476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8\": rpc error: code = NotFound desc = could not find container \"476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8\": container with ID starting with 476bd11c97b3916a913e9cca1183718468b802b377c417e1b79007f9917876c8 not found: ID does not exist" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.437274 4839 scope.go:117] "RemoveContainer" containerID="dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6" Mar 21 04:51:54 crc kubenswrapper[4839]: E0321 04:51:54.437693 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6\": container with ID starting with 
dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6 not found: ID does not exist" containerID="dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.437742 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6"} err="failed to get container status \"dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6\": rpc error: code = NotFound desc = could not find container \"dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6\": container with ID starting with dab44ad0cfa8219f7ebc4e43b577d42e27b0c303e738076c7a29783d924e0ab6 not found: ID does not exist" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.437770 4839 scope.go:117] "RemoveContainer" containerID="a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d" Mar 21 04:51:54 crc kubenswrapper[4839]: E0321 04:51:54.438049 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d\": container with ID starting with a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d not found: ID does not exist" containerID="a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d" Mar 21 04:51:54 crc kubenswrapper[4839]: I0321 04:51:54.438073 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d"} err="failed to get container status \"a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d\": rpc error: code = NotFound desc = could not find container \"a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d\": container with ID starting with a746950b3578c4d87cac3144ec92f3da634b4a5d684a79031e78585a50a57c8d not found: ID does not 
exist" Mar 21 04:51:55 crc kubenswrapper[4839]: I0321 04:51:55.944783 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "913aacec-84de-44a6-98fb-382c04095d62" (UID: "913aacec-84de-44a6-98fb-382c04095d62"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:51:55 crc kubenswrapper[4839]: I0321 04:51:55.945453 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content\") pod \"913aacec-84de-44a6-98fb-382c04095d62\" (UID: \"913aacec-84de-44a6-98fb-382c04095d62\") " Mar 21 04:51:55 crc kubenswrapper[4839]: W0321 04:51:55.945556 4839 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/913aacec-84de-44a6-98fb-382c04095d62/volumes/kubernetes.io~empty-dir/catalog-content Mar 21 04:51:55 crc kubenswrapper[4839]: I0321 04:51:55.945627 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "913aacec-84de-44a6-98fb-382c04095d62" (UID: "913aacec-84de-44a6-98fb-382c04095d62"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:51:55 crc kubenswrapper[4839]: I0321 04:51:55.946072 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/913aacec-84de-44a6-98fb-382c04095d62-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:51:56 crc kubenswrapper[4839]: I0321 04:51:56.199273 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f4lbn"] Mar 21 04:51:56 crc kubenswrapper[4839]: I0321 04:51:56.207052 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f4lbn"] Mar 21 04:51:56 crc kubenswrapper[4839]: I0321 04:51:56.464082 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="913aacec-84de-44a6-98fb-382c04095d62" path="/var/lib/kubelet/pods/913aacec-84de-44a6-98fb-382c04095d62/volumes" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.144783 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567812-jglhv"] Mar 21 04:52:00 crc kubenswrapper[4839]: E0321 04:52:00.145888 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="registry-server" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.145906 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="registry-server" Mar 21 04:52:00 crc kubenswrapper[4839]: E0321 04:52:00.145960 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="extract-utilities" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.145970 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="extract-utilities" Mar 21 04:52:00 crc kubenswrapper[4839]: E0321 04:52:00.145984 4839 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="extract-content" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.145993 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="extract-content" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.146209 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="913aacec-84de-44a6-98fb-382c04095d62" containerName="registry-server" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.146895 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567812-jglhv" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.150378 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.150624 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.150763 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.156593 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567812-jglhv"] Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.325903 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f88r4\" (UniqueName: \"kubernetes.io/projected/d13124dd-cca5-49f6-9638-2cb42ed2bb34-kube-api-access-f88r4\") pod \"auto-csr-approver-29567812-jglhv\" (UID: \"d13124dd-cca5-49f6-9638-2cb42ed2bb34\") " pod="openshift-infra/auto-csr-approver-29567812-jglhv" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.427780 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f88r4\" 
(UniqueName: \"kubernetes.io/projected/d13124dd-cca5-49f6-9638-2cb42ed2bb34-kube-api-access-f88r4\") pod \"auto-csr-approver-29567812-jglhv\" (UID: \"d13124dd-cca5-49f6-9638-2cb42ed2bb34\") " pod="openshift-infra/auto-csr-approver-29567812-jglhv" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.450135 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f88r4\" (UniqueName: \"kubernetes.io/projected/d13124dd-cca5-49f6-9638-2cb42ed2bb34-kube-api-access-f88r4\") pod \"auto-csr-approver-29567812-jglhv\" (UID: \"d13124dd-cca5-49f6-9638-2cb42ed2bb34\") " pod="openshift-infra/auto-csr-approver-29567812-jglhv" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.475599 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567812-jglhv" Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.969424 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567812-jglhv"] Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.979731 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:52:00 crc kubenswrapper[4839]: I0321 04:52:00.979785 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:52:01 crc kubenswrapper[4839]: I0321 04:52:01.415721 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567812-jglhv" 
event={"ID":"d13124dd-cca5-49f6-9638-2cb42ed2bb34","Type":"ContainerStarted","Data":"f4b992763e87881a8f6e9f9c3cac8b18f69942c9aec4b03795a6769871a27094"} Mar 21 04:52:02 crc kubenswrapper[4839]: I0321 04:52:02.426456 4839 generic.go:334] "Generic (PLEG): container finished" podID="d13124dd-cca5-49f6-9638-2cb42ed2bb34" containerID="5189f213ccdcf6760a09eb930ee4482a2d44b649489c1422b2b1e4b3849ef663" exitCode=0 Mar 21 04:52:02 crc kubenswrapper[4839]: I0321 04:52:02.426617 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567812-jglhv" event={"ID":"d13124dd-cca5-49f6-9638-2cb42ed2bb34","Type":"ContainerDied","Data":"5189f213ccdcf6760a09eb930ee4482a2d44b649489c1422b2b1e4b3849ef663"} Mar 21 04:52:03 crc kubenswrapper[4839]: I0321 04:52:03.746121 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567812-jglhv" Mar 21 04:52:03 crc kubenswrapper[4839]: I0321 04:52:03.894174 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f88r4\" (UniqueName: \"kubernetes.io/projected/d13124dd-cca5-49f6-9638-2cb42ed2bb34-kube-api-access-f88r4\") pod \"d13124dd-cca5-49f6-9638-2cb42ed2bb34\" (UID: \"d13124dd-cca5-49f6-9638-2cb42ed2bb34\") " Mar 21 04:52:03 crc kubenswrapper[4839]: I0321 04:52:03.905044 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13124dd-cca5-49f6-9638-2cb42ed2bb34-kube-api-access-f88r4" (OuterVolumeSpecName: "kube-api-access-f88r4") pod "d13124dd-cca5-49f6-9638-2cb42ed2bb34" (UID: "d13124dd-cca5-49f6-9638-2cb42ed2bb34"). InnerVolumeSpecName "kube-api-access-f88r4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:52:03 crc kubenswrapper[4839]: I0321 04:52:03.998196 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f88r4\" (UniqueName: \"kubernetes.io/projected/d13124dd-cca5-49f6-9638-2cb42ed2bb34-kube-api-access-f88r4\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:04 crc kubenswrapper[4839]: I0321 04:52:04.446008 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567812-jglhv" event={"ID":"d13124dd-cca5-49f6-9638-2cb42ed2bb34","Type":"ContainerDied","Data":"f4b992763e87881a8f6e9f9c3cac8b18f69942c9aec4b03795a6769871a27094"} Mar 21 04:52:04 crc kubenswrapper[4839]: I0321 04:52:04.446071 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567812-jglhv" Mar 21 04:52:04 crc kubenswrapper[4839]: I0321 04:52:04.446086 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4b992763e87881a8f6e9f9c3cac8b18f69942c9aec4b03795a6769871a27094" Mar 21 04:52:04 crc kubenswrapper[4839]: I0321 04:52:04.841687 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567806-g4rcl"] Mar 21 04:52:04 crc kubenswrapper[4839]: I0321 04:52:04.854206 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567806-g4rcl"] Mar 21 04:52:06 crc kubenswrapper[4839]: I0321 04:52:06.464488 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75c1454e-0aed-48d9-a0f2-f7c2797156ce" path="/var/lib/kubelet/pods/75c1454e-0aed-48d9-a0f2-f7c2797156ce/volumes" Mar 21 04:52:10 crc kubenswrapper[4839]: I0321 04:52:10.497396 4839 generic.go:334] "Generic (PLEG): container finished" podID="a1d76458-d587-4960-9bcc-7e3d3122b44d" containerID="70456d0f0e0073bde0ceeec7013aa756cec85385ed747be87d7d96cfa8d04986" exitCode=0 Mar 21 04:52:10 crc kubenswrapper[4839]: I0321 04:52:10.497487 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" event={"ID":"a1d76458-d587-4960-9bcc-7e3d3122b44d","Type":"ContainerDied","Data":"70456d0f0e0073bde0ceeec7013aa756cec85385ed747be87d7d96cfa8d04986"} Mar 21 04:52:11 crc kubenswrapper[4839]: I0321 04:52:11.889061 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.050288 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-inventory\") pod \"a1d76458-d587-4960-9bcc-7e3d3122b44d\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.050376 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjf5g\" (UniqueName: \"kubernetes.io/projected/a1d76458-d587-4960-9bcc-7e3d3122b44d-kube-api-access-zjf5g\") pod \"a1d76458-d587-4960-9bcc-7e3d3122b44d\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.050442 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-bootstrap-combined-ca-bundle\") pod \"a1d76458-d587-4960-9bcc-7e3d3122b44d\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.050508 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-ssh-key-openstack-edpm-ipam\") pod \"a1d76458-d587-4960-9bcc-7e3d3122b44d\" (UID: \"a1d76458-d587-4960-9bcc-7e3d3122b44d\") " Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 
04:52:12.058705 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1d76458-d587-4960-9bcc-7e3d3122b44d-kube-api-access-zjf5g" (OuterVolumeSpecName: "kube-api-access-zjf5g") pod "a1d76458-d587-4960-9bcc-7e3d3122b44d" (UID: "a1d76458-d587-4960-9bcc-7e3d3122b44d"). InnerVolumeSpecName "kube-api-access-zjf5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.063716 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a1d76458-d587-4960-9bcc-7e3d3122b44d" (UID: "a1d76458-d587-4960-9bcc-7e3d3122b44d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.081382 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a1d76458-d587-4960-9bcc-7e3d3122b44d" (UID: "a1d76458-d587-4960-9bcc-7e3d3122b44d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.083774 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-inventory" (OuterVolumeSpecName: "inventory") pod "a1d76458-d587-4960-9bcc-7e3d3122b44d" (UID: "a1d76458-d587-4960-9bcc-7e3d3122b44d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.153498 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.153545 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjf5g\" (UniqueName: \"kubernetes.io/projected/a1d76458-d587-4960-9bcc-7e3d3122b44d-kube-api-access-zjf5g\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.153560 4839 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.153582 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1d76458-d587-4960-9bcc-7e3d3122b44d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.518963 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" event={"ID":"a1d76458-d587-4960-9bcc-7e3d3122b44d","Type":"ContainerDied","Data":"822a23cdb1c22865313c8050c3f022e1750a4731cc96eac72c126accbbe28877"} Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.519278 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="822a23cdb1c22865313c8050c3f022e1750a4731cc96eac72c126accbbe28877" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.519013 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.600367 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt"] Mar 21 04:52:12 crc kubenswrapper[4839]: E0321 04:52:12.600959 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1d76458-d587-4960-9bcc-7e3d3122b44d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.600985 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d76458-d587-4960-9bcc-7e3d3122b44d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 21 04:52:12 crc kubenswrapper[4839]: E0321 04:52:12.601024 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13124dd-cca5-49f6-9638-2cb42ed2bb34" containerName="oc" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.601034 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13124dd-cca5-49f6-9638-2cb42ed2bb34" containerName="oc" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.601262 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1d76458-d587-4960-9bcc-7e3d3122b44d" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.601300 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13124dd-cca5-49f6-9638-2cb42ed2bb34" containerName="oc" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.602076 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.612164 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt"] Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.613010 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.613127 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.613426 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.613488 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.771593 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.771745 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 
04:52:12.771849 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vstkz\" (UniqueName: \"kubernetes.io/projected/7f875f01-020a-4cd6-950a-4dbb6ccb344e-kube-api-access-vstkz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.873526 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.873658 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.873740 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vstkz\" (UniqueName: \"kubernetes.io/projected/7f875f01-020a-4cd6-950a-4dbb6ccb344e-kube-api-access-vstkz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.879303 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.881989 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.894923 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vstkz\" (UniqueName: \"kubernetes.io/projected/7f875f01-020a-4cd6-950a-4dbb6ccb344e-kube-api-access-vstkz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:12 crc kubenswrapper[4839]: I0321 04:52:12.929418 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" Mar 21 04:52:13 crc kubenswrapper[4839]: I0321 04:52:13.477724 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt"] Mar 21 04:52:13 crc kubenswrapper[4839]: I0321 04:52:13.532147 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" event={"ID":"7f875f01-020a-4cd6-950a-4dbb6ccb344e","Type":"ContainerStarted","Data":"c285fa2b3b39fe801417eab93d8e0dfdfe9b49b82a93f8d3496b2dd46f1f1041"} Mar 21 04:52:14 crc kubenswrapper[4839]: I0321 04:52:14.541727 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" event={"ID":"7f875f01-020a-4cd6-950a-4dbb6ccb344e","Type":"ContainerStarted","Data":"10146a412661c985917c18611c191561c13c12f150267b88fe5fad51b5ee448c"} Mar 21 04:52:14 crc kubenswrapper[4839]: I0321 04:52:14.570887 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" podStartSLOduration=2.055597203 podStartE2EDuration="2.570865011s" podCreationTimestamp="2026-03-21 04:52:12 +0000 UTC" firstStartedPulling="2026-03-21 04:52:13.498744088 +0000 UTC m=+1737.826530764" lastFinishedPulling="2026-03-21 04:52:14.014011906 +0000 UTC m=+1738.341798572" observedRunningTime="2026-03-21 04:52:14.564084141 +0000 UTC m=+1738.891870837" watchObservedRunningTime="2026-03-21 04:52:14.570865011 +0000 UTC m=+1738.898651687" Mar 21 04:52:30 crc kubenswrapper[4839]: I0321 04:52:30.980730 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:52:30 
crc kubenswrapper[4839]: I0321 04:52:30.981153 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:52:53 crc kubenswrapper[4839]: I0321 04:52:53.674734 4839 scope.go:117] "RemoveContainer" containerID="66cb92ff47a88ccd93ffde6b9853588d4c4d5f3a25eb2c7a9862fbe6f8dc60f8" Mar 21 04:53:00 crc kubenswrapper[4839]: I0321 04:53:00.980441 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 04:53:00 crc kubenswrapper[4839]: I0321 04:53:00.980963 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 04:53:00 crc kubenswrapper[4839]: I0321 04:53:00.981006 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 04:53:00 crc kubenswrapper[4839]: I0321 04:53:00.981737 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 04:53:00 crc kubenswrapper[4839]: I0321 
04:53:00.981793 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" gracePeriod=600 Mar 21 04:53:01 crc kubenswrapper[4839]: E0321 04:53:01.107798 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:53:01 crc kubenswrapper[4839]: I0321 04:53:01.967077 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" exitCode=0 Mar 21 04:53:01 crc kubenswrapper[4839]: I0321 04:53:01.967141 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"} Mar 21 04:53:01 crc kubenswrapper[4839]: I0321 04:53:01.967441 4839 scope.go:117] "RemoveContainer" containerID="c031ed8f7b7576f57e9530a46687f2f2de2e5c2a62f42435eef393cfd7af2b37" Mar 21 04:53:01 crc kubenswrapper[4839]: I0321 04:53:01.968108 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:53:01 crc kubenswrapper[4839]: E0321 04:53:01.968430 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:53:16 crc kubenswrapper[4839]: I0321 04:53:16.458738 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:53:16 crc kubenswrapper[4839]: E0321 04:53:16.459501 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.049964 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1eec-account-create-update-h7hp7"] Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.062280 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-v4k9c"] Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.073997 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-852d-account-create-update-nv5n7"] Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.081870 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-v4k9c"] Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.089872 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-852d-account-create-update-nv5n7"] Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.100483 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-1eec-account-create-update-h7hp7"] Mar 21 
04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.465682 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59ce13a7-d2a6-4c54-908d-39d1511da50b" path="/var/lib/kubelet/pods/59ce13a7-d2a6-4c54-908d-39d1511da50b/volumes" Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.466821 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ad3cc08-174a-4164-aa38-3d7f6fbed0c0" path="/var/lib/kubelet/pods/8ad3cc08-174a-4164-aa38-3d7f6fbed0c0/volumes" Mar 21 04:53:20 crc kubenswrapper[4839]: I0321 04:53:20.467893 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e779c2ff-ee70-4779-b3fc-3b3bf87aff47" path="/var/lib/kubelet/pods/e779c2ff-ee70-4779-b3fc-3b3bf87aff47/volumes" Mar 21 04:53:21 crc kubenswrapper[4839]: I0321 04:53:21.033206 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-tnx95"] Mar 21 04:53:21 crc kubenswrapper[4839]: I0321 04:53:21.041306 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-tnx95"] Mar 21 04:53:22 crc kubenswrapper[4839]: I0321 04:53:22.463808 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a6840a-2ece-4b8d-be60-caa89912db9f" path="/var/lib/kubelet/pods/c9a6840a-2ece-4b8d-be60-caa89912db9f/volumes" Mar 21 04:53:24 crc kubenswrapper[4839]: I0321 04:53:24.045202 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-31f4-account-create-update-98c9m"] Mar 21 04:53:24 crc kubenswrapper[4839]: I0321 04:53:24.053973 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-vqfbm"] Mar 21 04:53:24 crc kubenswrapper[4839]: I0321 04:53:24.062458 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-31f4-account-create-update-98c9m"] Mar 21 04:53:24 crc kubenswrapper[4839]: I0321 04:53:24.070039 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-vqfbm"] Mar 21 04:53:24 
crc kubenswrapper[4839]: I0321 04:53:24.463257 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5740bec9-4b0c-4092-8309-14fdb2562c2e" path="/var/lib/kubelet/pods/5740bec9-4b0c-4092-8309-14fdb2562c2e/volumes"
Mar 21 04:53:24 crc kubenswrapper[4839]: I0321 04:53:24.464434 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b46c59d5-1b87-471e-ae9b-b8ba7ca8d754" path="/var/lib/kubelet/pods/b46c59d5-1b87-471e-ae9b-b8ba7ca8d754/volumes"
Mar 21 04:53:31 crc kubenswrapper[4839]: I0321 04:53:31.452968 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:53:31 crc kubenswrapper[4839]: E0321 04:53:31.453409 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:53:39 crc kubenswrapper[4839]: I0321 04:53:39.310930 4839 generic.go:334] "Generic (PLEG): container finished" podID="7f875f01-020a-4cd6-950a-4dbb6ccb344e" containerID="10146a412661c985917c18611c191561c13c12f150267b88fe5fad51b5ee448c" exitCode=0
Mar 21 04:53:39 crc kubenswrapper[4839]: I0321 04:53:39.311138 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" event={"ID":"7f875f01-020a-4cd6-950a-4dbb6ccb344e","Type":"ContainerDied","Data":"10146a412661c985917c18611c191561c13c12f150267b88fe5fad51b5ee448c"}
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.720039 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt"
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.739600 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-ssh-key-openstack-edpm-ipam\") pod \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") "
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.739651 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vstkz\" (UniqueName: \"kubernetes.io/projected/7f875f01-020a-4cd6-950a-4dbb6ccb344e-kube-api-access-vstkz\") pod \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") "
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.739689 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-inventory\") pod \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\" (UID: \"7f875f01-020a-4cd6-950a-4dbb6ccb344e\") "
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.764913 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f875f01-020a-4cd6-950a-4dbb6ccb344e-kube-api-access-vstkz" (OuterVolumeSpecName: "kube-api-access-vstkz") pod "7f875f01-020a-4cd6-950a-4dbb6ccb344e" (UID: "7f875f01-020a-4cd6-950a-4dbb6ccb344e"). InnerVolumeSpecName "kube-api-access-vstkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.774505 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7f875f01-020a-4cd6-950a-4dbb6ccb344e" (UID: "7f875f01-020a-4cd6-950a-4dbb6ccb344e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.797416 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-inventory" (OuterVolumeSpecName: "inventory") pod "7f875f01-020a-4cd6-950a-4dbb6ccb344e" (UID: "7f875f01-020a-4cd6-950a-4dbb6ccb344e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.841965 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.842020 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vstkz\" (UniqueName: \"kubernetes.io/projected/7f875f01-020a-4cd6-950a-4dbb6ccb344e-kube-api-access-vstkz\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:40 crc kubenswrapper[4839]: I0321 04:53:40.842035 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7f875f01-020a-4cd6-950a-4dbb6ccb344e-inventory\") on node \"crc\" DevicePath \"\""
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.356303 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt" event={"ID":"7f875f01-020a-4cd6-950a-4dbb6ccb344e","Type":"ContainerDied","Data":"c285fa2b3b39fe801417eab93d8e0dfdfe9b49b82a93f8d3496b2dd46f1f1041"}
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.356985 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c285fa2b3b39fe801417eab93d8e0dfdfe9b49b82a93f8d3496b2dd46f1f1041"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.356794 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.439136 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"]
Mar 21 04:53:41 crc kubenswrapper[4839]: E0321 04:53:41.439680 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f875f01-020a-4cd6-950a-4dbb6ccb344e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.439702 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f875f01-020a-4cd6-950a-4dbb6ccb344e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.439972 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f875f01-020a-4cd6-950a-4dbb6ccb344e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.440763 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.445414 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.445531 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.445677 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.446255 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.460419 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"]
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.460507 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hxd9\" (UniqueName: \"kubernetes.io/projected/a58d82e4-2de9-4680-a08c-6eeb775ed08a-kube-api-access-9hxd9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.460616 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.460874 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.562384 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hxd9\" (UniqueName: \"kubernetes.io/projected/a58d82e4-2de9-4680-a08c-6eeb775ed08a-kube-api-access-9hxd9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.562441 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.562539 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.567364 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.577143 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.587680 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hxd9\" (UniqueName: \"kubernetes.io/projected/a58d82e4-2de9-4680-a08c-6eeb775ed08a-kube-api-access-9hxd9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:41 crc kubenswrapper[4839]: I0321 04:53:41.772796 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:53:42 crc kubenswrapper[4839]: I0321 04:53:42.314216 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"]
Mar 21 04:53:42 crc kubenswrapper[4839]: I0321 04:53:42.366231 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx" event={"ID":"a58d82e4-2de9-4680-a08c-6eeb775ed08a","Type":"ContainerStarted","Data":"ef1d592a6aae72b1b9bb235f77d743e0bad4065ccda22b837001cbeb2d26cd16"}
Mar 21 04:53:43 crc kubenswrapper[4839]: I0321 04:53:43.378799 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx" event={"ID":"a58d82e4-2de9-4680-a08c-6eeb775ed08a","Type":"ContainerStarted","Data":"2ff51247bd02a82c38abca9599a8bd0159eda3e65cc9e732eaaf1569f9b29e29"}
Mar 21 04:53:43 crc kubenswrapper[4839]: I0321 04:53:43.413505 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx" podStartSLOduration=1.793650655 podStartE2EDuration="2.413486108s" podCreationTimestamp="2026-03-21 04:53:41 +0000 UTC" firstStartedPulling="2026-03-21 04:53:42.314864541 +0000 UTC m=+1826.642651207" lastFinishedPulling="2026-03-21 04:53:42.934699984 +0000 UTC m=+1827.262486660" observedRunningTime="2026-03-21 04:53:43.398676742 +0000 UTC m=+1827.726463438" watchObservedRunningTime="2026-03-21 04:53:43.413486108 +0000 UTC m=+1827.741272784"
Mar 21 04:53:43 crc kubenswrapper[4839]: I0321 04:53:43.453849 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:53:43 crc kubenswrapper[4839]: E0321 04:53:43.454211 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.057703 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4d6f-account-create-update-st2sv"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.074384 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4d6f-account-create-update-st2sv"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.096878 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-79rjr"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.107007 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-79rjr"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.124842 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3d03-account-create-update-q9rgd"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.136262 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-h5448"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.147221 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a5f8-account-create-update-2srvb"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.158044 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3d03-account-create-update-q9rgd"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.168928 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-h5448"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.180102 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-nv7qf"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.191500 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a5f8-account-create-update-2srvb"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.203094 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-nv7qf"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.213154 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-xc9zf"]
Mar 21 04:53:47 crc kubenswrapper[4839]: I0321 04:53:47.222451 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-xc9zf"]
Mar 21 04:53:48 crc kubenswrapper[4839]: I0321 04:53:48.464172 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34a240db-9587-446e-af12-a44b87b1a3ac" path="/var/lib/kubelet/pods/34a240db-9587-446e-af12-a44b87b1a3ac/volumes"
Mar 21 04:53:48 crc kubenswrapper[4839]: I0321 04:53:48.464742 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c8ad856-1b19-4b1c-8124-2e316dd567ee" path="/var/lib/kubelet/pods/8c8ad856-1b19-4b1c-8124-2e316dd567ee/volumes"
Mar 21 04:53:48 crc kubenswrapper[4839]: I0321 04:53:48.465243 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d" path="/var/lib/kubelet/pods/9a5cee9b-67b3-40b1-bc62-e6a3c4c1272d/volumes"
Mar 21 04:53:48 crc kubenswrapper[4839]: I0321 04:53:48.465843 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7dfdbcf-7830-4f8d-a165-119fe80d999a" path="/var/lib/kubelet/pods/a7dfdbcf-7830-4f8d-a165-119fe80d999a/volumes"
Mar 21 04:53:48 crc kubenswrapper[4839]: I0321 04:53:48.466860 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae4cf7f5-74ed-45d7-ace7-24ada744db6c" path="/var/lib/kubelet/pods/ae4cf7f5-74ed-45d7-ace7-24ada744db6c/volumes"
Mar 21 04:53:48 crc kubenswrapper[4839]: I0321 04:53:48.467397 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b567c69c-d110-4ab2-aaf7-da82f0e72cc3" path="/var/lib/kubelet/pods/b567c69c-d110-4ab2-aaf7-da82f0e72cc3/volumes"
Mar 21 04:53:48 crc kubenswrapper[4839]: I0321 04:53:48.467926 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f816daf8-a9c7-4e99-a622-2f9bee7d203a" path="/var/lib/kubelet/pods/f816daf8-a9c7-4e99-a622-2f9bee7d203a/volumes"
Mar 21 04:53:53 crc kubenswrapper[4839]: I0321 04:53:53.774499 4839 scope.go:117] "RemoveContainer" containerID="032f3b05c1ff562800fafe59fd0384b7d678921d7fe7e90157dab690dc2e9894"
Mar 21 04:53:53 crc kubenswrapper[4839]: I0321 04:53:53.808685 4839 scope.go:117] "RemoveContainer" containerID="048e9e28ac07a1e9124d69a89e17059f1d443023c6faf3348223cf9a7387e352"
Mar 21 04:53:53 crc kubenswrapper[4839]: I0321 04:53:53.841010 4839 scope.go:117] "RemoveContainer" containerID="3a57c10b2c78e441d84cbfab2416b69cf3571cc562b77b3b3134c8875131a599"
Mar 21 04:53:53 crc kubenswrapper[4839]: I0321 04:53:53.880224 4839 scope.go:117] "RemoveContainer" containerID="1fd2a9e659b1f417a5acc26d40481b58d37731bc164379bcd010cc11a61ef9ec"
Mar 21 04:53:53 crc kubenswrapper[4839]: I0321 04:53:53.920239 4839 scope.go:117] "RemoveContainer" containerID="18ee77a1f0c351aba88f15dc3bad4a37015f55b27e92d2f7b43fdfe709bc67ef"
Mar 21 04:53:53 crc kubenswrapper[4839]: I0321 04:53:53.960357 4839 scope.go:117] "RemoveContainer" containerID="e304597468db9fac443bca06d530c54513659f708975616b601e678f7766dbe4"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.010486 4839 scope.go:117] "RemoveContainer" containerID="1c106f22b4e401c674f904200a929e5e68e3e4f4a62178a136c50ceb882cf719"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.039973 4839 scope.go:117] "RemoveContainer" containerID="bc85e819a8b1f2def449cfd0987dc3cf3c1805c923545af4ec58edeba1a10775"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.058611 4839 scope.go:117] "RemoveContainer" containerID="ebddf9a5729dde6feb4416ede20f92a3dd052bc816ed0593e001a7eb65da5807"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.079870 4839 scope.go:117] "RemoveContainer" containerID="1fc9b78e56e247468e98f90edfea187e15daf4ea152975c90e4d68c87986ba79"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.101392 4839 scope.go:117] "RemoveContainer" containerID="435dd7b699a596fb94e68ae9d7689a76011012e1ee2be4e567ac5a478d536eb6"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.125118 4839 scope.go:117] "RemoveContainer" containerID="3b8d2c5a2c9686ff0f867f32b67ae1a47eca812fdcbd3b93adee22c245151532"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.143350 4839 scope.go:117] "RemoveContainer" containerID="e6c4b76ad2e0d2ae69413d1d9b61feffab9768c0ce8180f11f0b591ba10e6f2c"
Mar 21 04:53:54 crc kubenswrapper[4839]: I0321 04:53:54.453097 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:53:54 crc kubenswrapper[4839]: E0321 04:53:54.453400 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:53:58 crc kubenswrapper[4839]: I0321 04:53:58.050104 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-qgdlf"]
Mar 21 04:53:58 crc kubenswrapper[4839]: I0321 04:53:58.059904 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-qgdlf"]
Mar 21 04:53:58 crc kubenswrapper[4839]: I0321 04:53:58.462473 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc21c34c-13c1-4733-9013-0cfd304b179c" path="/var/lib/kubelet/pods/bc21c34c-13c1-4733-9013-0cfd304b179c/volumes"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.143830 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567814-q8zxw"]
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.145027 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-q8zxw"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.147109 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.147532 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.148181 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.152543 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-q8zxw"]
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.208739 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wmrq\" (UniqueName: \"kubernetes.io/projected/852785cf-c79d-4c8e-92f0-f15d9836b437-kube-api-access-7wmrq\") pod \"auto-csr-approver-29567814-q8zxw\" (UID: \"852785cf-c79d-4c8e-92f0-f15d9836b437\") " pod="openshift-infra/auto-csr-approver-29567814-q8zxw"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.312256 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wmrq\" (UniqueName: \"kubernetes.io/projected/852785cf-c79d-4c8e-92f0-f15d9836b437-kube-api-access-7wmrq\") pod \"auto-csr-approver-29567814-q8zxw\" (UID: \"852785cf-c79d-4c8e-92f0-f15d9836b437\") " pod="openshift-infra/auto-csr-approver-29567814-q8zxw"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.331430 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wmrq\" (UniqueName: \"kubernetes.io/projected/852785cf-c79d-4c8e-92f0-f15d9836b437-kube-api-access-7wmrq\") pod \"auto-csr-approver-29567814-q8zxw\" (UID: \"852785cf-c79d-4c8e-92f0-f15d9836b437\") " pod="openshift-infra/auto-csr-approver-29567814-q8zxw"
Mar 21 04:54:00 crc kubenswrapper[4839]: I0321 04:54:00.585902 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-q8zxw"
Mar 21 04:54:01 crc kubenswrapper[4839]: I0321 04:54:01.006236 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-q8zxw"]
Mar 21 04:54:01 crc kubenswrapper[4839]: W0321 04:54:01.013094 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod852785cf_c79d_4c8e_92f0_f15d9836b437.slice/crio-aa429bb7267da47a516aa432749d3abd8d7562f8f699fd66320b5fb06dbe002d WatchSource:0}: Error finding container aa429bb7267da47a516aa432749d3abd8d7562f8f699fd66320b5fb06dbe002d: Status 404 returned error can't find the container with id aa429bb7267da47a516aa432749d3abd8d7562f8f699fd66320b5fb06dbe002d
Mar 21 04:54:01 crc kubenswrapper[4839]: I0321 04:54:01.533762 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567814-q8zxw" event={"ID":"852785cf-c79d-4c8e-92f0-f15d9836b437","Type":"ContainerStarted","Data":"aa429bb7267da47a516aa432749d3abd8d7562f8f699fd66320b5fb06dbe002d"}
Mar 21 04:54:03 crc kubenswrapper[4839]: I0321 04:54:03.550518 4839 generic.go:334] "Generic (PLEG): container finished" podID="852785cf-c79d-4c8e-92f0-f15d9836b437" containerID="d122e9d27915a31245552d8140bcb2b6f44ab9e8f5d0f2da420a748e2a0ab38c" exitCode=0
Mar 21 04:54:03 crc kubenswrapper[4839]: I0321 04:54:03.550695 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567814-q8zxw" event={"ID":"852785cf-c79d-4c8e-92f0-f15d9836b437","Type":"ContainerDied","Data":"d122e9d27915a31245552d8140bcb2b6f44ab9e8f5d0f2da420a748e2a0ab38c"}
Mar 21 04:54:04 crc kubenswrapper[4839]: I0321 04:54:04.900690 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-q8zxw"
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.102420 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wmrq\" (UniqueName: \"kubernetes.io/projected/852785cf-c79d-4c8e-92f0-f15d9836b437-kube-api-access-7wmrq\") pod \"852785cf-c79d-4c8e-92f0-f15d9836b437\" (UID: \"852785cf-c79d-4c8e-92f0-f15d9836b437\") "
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.109771 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/852785cf-c79d-4c8e-92f0-f15d9836b437-kube-api-access-7wmrq" (OuterVolumeSpecName: "kube-api-access-7wmrq") pod "852785cf-c79d-4c8e-92f0-f15d9836b437" (UID: "852785cf-c79d-4c8e-92f0-f15d9836b437"). InnerVolumeSpecName "kube-api-access-7wmrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.205073 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wmrq\" (UniqueName: \"kubernetes.io/projected/852785cf-c79d-4c8e-92f0-f15d9836b437-kube-api-access-7wmrq\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.569589 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567814-q8zxw" event={"ID":"852785cf-c79d-4c8e-92f0-f15d9836b437","Type":"ContainerDied","Data":"aa429bb7267da47a516aa432749d3abd8d7562f8f699fd66320b5fb06dbe002d"}
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.569640 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa429bb7267da47a516aa432749d3abd8d7562f8f699fd66320b5fb06dbe002d"
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.569638 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567814-q8zxw"
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.961170 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567808-pxvv9"]
Mar 21 04:54:05 crc kubenswrapper[4839]: I0321 04:54:05.976507 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567808-pxvv9"]
Mar 21 04:54:06 crc kubenswrapper[4839]: I0321 04:54:06.464035 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:54:06 crc kubenswrapper[4839]: E0321 04:54:06.464646 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:54:06 crc kubenswrapper[4839]: I0321 04:54:06.467318 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d56af53-fce2-4320-b4fa-32b5c6798921" path="/var/lib/kubelet/pods/5d56af53-fce2-4320-b4fa-32b5c6798921/volumes"
Mar 21 04:54:16 crc kubenswrapper[4839]: I0321 04:54:16.030261 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-ng2tw"]
Mar 21 04:54:16 crc kubenswrapper[4839]: I0321 04:54:16.039756 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-ng2tw"]
Mar 21 04:54:16 crc kubenswrapper[4839]: I0321 04:54:16.465125 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cc1dfb9-8108-46e5-8dc5-5b555590ecc1" path="/var/lib/kubelet/pods/2cc1dfb9-8108-46e5-8dc5-5b555590ecc1/volumes"
Mar 21 04:54:19 crc kubenswrapper[4839]: I0321 04:54:19.452970 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:54:19 crc kubenswrapper[4839]: E0321 04:54:19.454865 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:54:33 crc kubenswrapper[4839]: I0321 04:54:33.037506 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-nm9t5"]
Mar 21 04:54:33 crc kubenswrapper[4839]: I0321 04:54:33.045676 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-nm9t5"]
Mar 21 04:54:34 crc kubenswrapper[4839]: I0321 04:54:34.452970 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:54:34 crc kubenswrapper[4839]: E0321 04:54:34.453286 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:54:34 crc kubenswrapper[4839]: I0321 04:54:34.463919 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="625a99bd-bc01-400e-8e9c-1f5eff390466" path="/var/lib/kubelet/pods/625a99bd-bc01-400e-8e9c-1f5eff390466/volumes"
Mar 21 04:54:42 crc kubenswrapper[4839]: I0321 04:54:42.040636 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-wdddk"]
Mar 21 04:54:42 crc kubenswrapper[4839]: I0321 04:54:42.049900 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-wdddk"]
Mar 21 04:54:42 crc kubenswrapper[4839]: I0321 04:54:42.464034 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6e87cbd-1f46-4fa0-9529-8250f9fee21c" path="/var/lib/kubelet/pods/e6e87cbd-1f46-4fa0-9529-8250f9fee21c/volumes"
Mar 21 04:54:45 crc kubenswrapper[4839]: I0321 04:54:45.028092 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-t8kxj"]
Mar 21 04:54:45 crc kubenswrapper[4839]: I0321 04:54:45.039110 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ts52d"]
Mar 21 04:54:45 crc kubenswrapper[4839]: I0321 04:54:45.050476 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-t8kxj"]
Mar 21 04:54:45 crc kubenswrapper[4839]: I0321 04:54:45.059731 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ts52d"]
Mar 21 04:54:46 crc kubenswrapper[4839]: I0321 04:54:46.460132 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:54:46 crc kubenswrapper[4839]: E0321 04:54:46.460411 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:54:46 crc kubenswrapper[4839]: I0321 04:54:46.462925 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d0e1745-6e0b-475c-a1de-d049018abea6" path="/var/lib/kubelet/pods/6d0e1745-6e0b-475c-a1de-d049018abea6/volumes"
Mar 21 04:54:46 crc kubenswrapper[4839]: I0321 04:54:46.463523 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cada35b-7e7f-4d22-895f-588b90e48c70" path="/var/lib/kubelet/pods/7cada35b-7e7f-4d22-895f-588b90e48c70/volumes"
Mar 21 04:54:54 crc kubenswrapper[4839]: I0321 04:54:54.367037 4839 scope.go:117] "RemoveContainer" containerID="3f2d4fa09933468a7b6e88aaba055705019ffd1468416047f45f0ae828c805fe"
Mar 21 04:54:54 crc kubenswrapper[4839]: I0321 04:54:54.403538 4839 scope.go:117] "RemoveContainer" containerID="fe7545d66419e9d11543f534eecf214e1fa485d02ad773333c092ee39cadde88"
Mar 21 04:54:54 crc kubenswrapper[4839]: I0321 04:54:54.452910 4839 scope.go:117] "RemoveContainer" containerID="79604402661ee3c465cb72ff146dbc568553c3204385175c4f68e9dccfa5a6c6"
Mar 21 04:54:54 crc kubenswrapper[4839]: I0321 04:54:54.491838 4839 scope.go:117] "RemoveContainer" containerID="848904d0e2ac99454595812a77ae5d4f4ec6aacc9198508a3ea49e5fd72d6ee4"
Mar 21 04:54:54 crc kubenswrapper[4839]: I0321 04:54:54.540457 4839 scope.go:117] "RemoveContainer" containerID="143fbf65afa2773912765c6bb85681ce2740b19aa556d5df9884eb40a87ddf95"
Mar 21 04:54:54 crc kubenswrapper[4839]: I0321 04:54:54.586442 4839 scope.go:117] "RemoveContainer" containerID="dfcec3a2306ecb1c0b0e9a1bd05577683dcbd7efc3319d4ee942c6e22862d913"
Mar 21 04:54:54 crc kubenswrapper[4839]: I0321 04:54:54.640939 4839 scope.go:117] "RemoveContainer" containerID="412e0d9615c7dcab7728f617fda54216ecfc01e31d3155750522d0825a7d167a"
Mar 21 04:54:56 crc kubenswrapper[4839]: I0321 04:54:56.593195 4839 generic.go:334] "Generic (PLEG): container finished" podID="a58d82e4-2de9-4680-a08c-6eeb775ed08a" containerID="2ff51247bd02a82c38abca9599a8bd0159eda3e65cc9e732eaaf1569f9b29e29" exitCode=0
Mar 21 04:54:56 crc kubenswrapper[4839]: I0321 04:54:56.593298 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx" event={"ID":"a58d82e4-2de9-4680-a08c-6eeb775ed08a","Type":"ContainerDied","Data":"2ff51247bd02a82c38abca9599a8bd0159eda3e65cc9e732eaaf1569f9b29e29"}
Mar 21 04:54:57 crc kubenswrapper[4839]: I0321 04:54:57.453252 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109"
Mar 21 04:54:57 crc kubenswrapper[4839]: E0321 04:54:57.453510 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.003435 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.126870 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-ssh-key-openstack-edpm-ipam\") pod \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") "
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.127290 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hxd9\" (UniqueName: \"kubernetes.io/projected/a58d82e4-2de9-4680-a08c-6eeb775ed08a-kube-api-access-9hxd9\") pod \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") "
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.127549 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-inventory\") pod \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\" (UID: \"a58d82e4-2de9-4680-a08c-6eeb775ed08a\") "
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.132251 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a58d82e4-2de9-4680-a08c-6eeb775ed08a-kube-api-access-9hxd9" (OuterVolumeSpecName: "kube-api-access-9hxd9") pod "a58d82e4-2de9-4680-a08c-6eeb775ed08a" (UID: "a58d82e4-2de9-4680-a08c-6eeb775ed08a"). InnerVolumeSpecName "kube-api-access-9hxd9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.156383 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a58d82e4-2de9-4680-a08c-6eeb775ed08a" (UID: "a58d82e4-2de9-4680-a08c-6eeb775ed08a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.157341 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-inventory" (OuterVolumeSpecName: "inventory") pod "a58d82e4-2de9-4680-a08c-6eeb775ed08a" (UID: "a58d82e4-2de9-4680-a08c-6eeb775ed08a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.230313 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-inventory\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.230622 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a58d82e4-2de9-4680-a08c-6eeb775ed08a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.230716 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hxd9\" (UniqueName: \"kubernetes.io/projected/a58d82e4-2de9-4680-a08c-6eeb775ed08a-kube-api-access-9hxd9\") on node \"crc\" DevicePath \"\""
Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.611527 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx"
event={"ID":"a58d82e4-2de9-4680-a08c-6eeb775ed08a","Type":"ContainerDied","Data":"ef1d592a6aae72b1b9bb235f77d743e0bad4065ccda22b837001cbeb2d26cd16"} Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.611588 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.611607 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef1d592a6aae72b1b9bb235f77d743e0bad4065ccda22b837001cbeb2d26cd16" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.695545 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h"] Mar 21 04:54:58 crc kubenswrapper[4839]: E0321 04:54:58.695963 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852785cf-c79d-4c8e-92f0-f15d9836b437" containerName="oc" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.695998 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="852785cf-c79d-4c8e-92f0-f15d9836b437" containerName="oc" Mar 21 04:54:58 crc kubenswrapper[4839]: E0321 04:54:58.696024 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a58d82e4-2de9-4680-a08c-6eeb775ed08a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.696031 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a58d82e4-2de9-4680-a08c-6eeb775ed08a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.696335 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a58d82e4-2de9-4680-a08c-6eeb775ed08a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.696351 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="852785cf-c79d-4c8e-92f0-f15d9836b437" containerName="oc" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.697107 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.698559 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.699272 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.699310 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.699397 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.706999 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h"] Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.842643 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gzp4\" (UniqueName: \"kubernetes.io/projected/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-kube-api-access-7gzp4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.842851 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-ssh-key-openstack-edpm-ipam\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.843204 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.944902 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.944997 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gzp4\" (UniqueName: \"kubernetes.io/projected/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-kube-api-access-7gzp4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.945049 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.948878 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.949040 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:58 crc kubenswrapper[4839]: I0321 04:54:58.966675 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gzp4\" (UniqueName: \"kubernetes.io/projected/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-kube-api-access-7gzp4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:59 crc kubenswrapper[4839]: I0321 04:54:59.018679 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:54:59 crc kubenswrapper[4839]: I0321 04:54:59.506275 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h"] Mar 21 04:54:59 crc kubenswrapper[4839]: I0321 04:54:59.619583 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" event={"ID":"f9d60b3b-b1b4-4d98-9da2-e152ac410c81","Type":"ContainerStarted","Data":"eadb9ad06806647ab9e379d0ae46ee8dc799d857833f8f6433a7547d0d7d61d7"} Mar 21 04:55:00 crc kubenswrapper[4839]: I0321 04:55:00.631203 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" event={"ID":"f9d60b3b-b1b4-4d98-9da2-e152ac410c81","Type":"ContainerStarted","Data":"79265d762afb243298ce5f51d33dbbb4aef590602f5e1376cea25cefa028f66d"} Mar 21 04:55:00 crc kubenswrapper[4839]: I0321 04:55:00.651471 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" podStartSLOduration=2.246753007 podStartE2EDuration="2.65145288s" podCreationTimestamp="2026-03-21 04:54:58 +0000 UTC" firstStartedPulling="2026-03-21 04:54:59.511773588 +0000 UTC m=+1903.839560254" lastFinishedPulling="2026-03-21 04:54:59.916473451 +0000 UTC m=+1904.244260127" observedRunningTime="2026-03-21 04:55:00.647992052 +0000 UTC m=+1904.975778738" watchObservedRunningTime="2026-03-21 04:55:00.65145288 +0000 UTC m=+1904.979239556" Mar 21 04:55:04 crc kubenswrapper[4839]: I0321 04:55:04.663343 4839 generic.go:334] "Generic (PLEG): container finished" podID="f9d60b3b-b1b4-4d98-9da2-e152ac410c81" containerID="79265d762afb243298ce5f51d33dbbb4aef590602f5e1376cea25cefa028f66d" exitCode=0 Mar 21 04:55:04 crc kubenswrapper[4839]: I0321 04:55:04.663630 4839 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" event={"ID":"f9d60b3b-b1b4-4d98-9da2-e152ac410c81","Type":"ContainerDied","Data":"79265d762afb243298ce5f51d33dbbb4aef590602f5e1376cea25cefa028f66d"} Mar 21 04:55:05 crc kubenswrapper[4839]: I0321 04:55:05.043914 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-qfjms"] Mar 21 04:55:05 crc kubenswrapper[4839]: I0321 04:55:05.053034 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-qfjms"] Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.033988 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.177914 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-ssh-key-openstack-edpm-ipam\") pod \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.178098 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gzp4\" (UniqueName: \"kubernetes.io/projected/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-kube-api-access-7gzp4\") pod \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.178131 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-inventory\") pod \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\" (UID: \"f9d60b3b-b1b4-4d98-9da2-e152ac410c81\") " Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.183871 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-kube-api-access-7gzp4" (OuterVolumeSpecName: "kube-api-access-7gzp4") pod "f9d60b3b-b1b4-4d98-9da2-e152ac410c81" (UID: "f9d60b3b-b1b4-4d98-9da2-e152ac410c81"). InnerVolumeSpecName "kube-api-access-7gzp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.205592 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f9d60b3b-b1b4-4d98-9da2-e152ac410c81" (UID: "f9d60b3b-b1b4-4d98-9da2-e152ac410c81"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.206187 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-inventory" (OuterVolumeSpecName: "inventory") pod "f9d60b3b-b1b4-4d98-9da2-e152ac410c81" (UID: "f9d60b3b-b1b4-4d98-9da2-e152ac410c81"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.281493 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gzp4\" (UniqueName: \"kubernetes.io/projected/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-kube-api-access-7gzp4\") on node \"crc\" DevicePath \"\"" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.281542 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.281556 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9d60b3b-b1b4-4d98-9da2-e152ac410c81-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.465307 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6000d2d4-e84a-443f-9094-ab999541331d" path="/var/lib/kubelet/pods/6000d2d4-e84a-443f-9094-ab999541331d/volumes" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.679274 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" event={"ID":"f9d60b3b-b1b4-4d98-9da2-e152ac410c81","Type":"ContainerDied","Data":"eadb9ad06806647ab9e379d0ae46ee8dc799d857833f8f6433a7547d0d7d61d7"} Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.679324 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eadb9ad06806647ab9e379d0ae46ee8dc799d857833f8f6433a7547d0d7d61d7" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.679826 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.817320 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2"] Mar 21 04:55:06 crc kubenswrapper[4839]: E0321 04:55:06.817720 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9d60b3b-b1b4-4d98-9da2-e152ac410c81" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.817739 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9d60b3b-b1b4-4d98-9da2-e152ac410c81" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.817944 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9d60b3b-b1b4-4d98-9da2-e152ac410c81" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.819333 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.823339 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.823584 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.823697 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.823801 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.833666 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2"] Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.898370 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 04:55:06.898947 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8tht\" (UniqueName: \"kubernetes.io/projected/7538d496-3768-42b7-9f2e-70e1b44a9d6b-kube-api-access-v8tht\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:06 crc kubenswrapper[4839]: I0321 
04:55:06.899179 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.000450 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.000722 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.001033 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8tht\" (UniqueName: \"kubernetes.io/projected/7538d496-3768-42b7-9f2e-70e1b44a9d6b-kube-api-access-v8tht\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.005118 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.005162 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.020843 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8tht\" (UniqueName: \"kubernetes.io/projected/7538d496-3768-42b7-9f2e-70e1b44a9d6b-kube-api-access-v8tht\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xdvx2\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.153117 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.821956 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2"] Mar 21 04:55:07 crc kubenswrapper[4839]: I0321 04:55:07.834189 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 04:55:08 crc kubenswrapper[4839]: I0321 04:55:08.695526 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" event={"ID":"7538d496-3768-42b7-9f2e-70e1b44a9d6b","Type":"ContainerStarted","Data":"c2bed0a99a23391c3bf403e5b41349a8afe66f05e34f738cc9341c78027bd20e"} Mar 21 04:55:09 crc kubenswrapper[4839]: I0321 04:55:09.704586 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" event={"ID":"7538d496-3768-42b7-9f2e-70e1b44a9d6b","Type":"ContainerStarted","Data":"3c6cee74975474902b422479025952e542c55d69697ddabc5338d76b71ec669a"} Mar 21 04:55:09 crc kubenswrapper[4839]: I0321 04:55:09.727827 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" podStartSLOduration=3.082889663 podStartE2EDuration="3.727785664s" podCreationTimestamp="2026-03-21 04:55:06 +0000 UTC" firstStartedPulling="2026-03-21 04:55:07.833906351 +0000 UTC m=+1912.161693027" lastFinishedPulling="2026-03-21 04:55:08.478802352 +0000 UTC m=+1912.806589028" observedRunningTime="2026-03-21 04:55:09.722083233 +0000 UTC m=+1914.049869909" watchObservedRunningTime="2026-03-21 04:55:09.727785664 +0000 UTC m=+1914.055572340" Mar 21 04:55:11 crc kubenswrapper[4839]: I0321 04:55:11.453703 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:55:11 crc kubenswrapper[4839]: E0321 
04:55:11.454305 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:55:24 crc kubenswrapper[4839]: I0321 04:55:24.453226 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:55:24 crc kubenswrapper[4839]: E0321 04:55:24.453917 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:55:39 crc kubenswrapper[4839]: I0321 04:55:39.453425 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:55:39 crc kubenswrapper[4839]: E0321 04:55:39.454313 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:55:40 crc kubenswrapper[4839]: I0321 04:55:40.949707 4839 generic.go:334] "Generic (PLEG): container finished" podID="7538d496-3768-42b7-9f2e-70e1b44a9d6b" 
containerID="3c6cee74975474902b422479025952e542c55d69697ddabc5338d76b71ec669a" exitCode=0 Mar 21 04:55:40 crc kubenswrapper[4839]: I0321 04:55:40.949758 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" event={"ID":"7538d496-3768-42b7-9f2e-70e1b44a9d6b","Type":"ContainerDied","Data":"3c6cee74975474902b422479025952e542c55d69697ddabc5338d76b71ec669a"} Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.325639 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.478249 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8tht\" (UniqueName: \"kubernetes.io/projected/7538d496-3768-42b7-9f2e-70e1b44a9d6b-kube-api-access-v8tht\") pod \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.478654 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-inventory\") pod \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.478758 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-ssh-key-openstack-edpm-ipam\") pod \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\" (UID: \"7538d496-3768-42b7-9f2e-70e1b44a9d6b\") " Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.484028 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7538d496-3768-42b7-9f2e-70e1b44a9d6b-kube-api-access-v8tht" (OuterVolumeSpecName: "kube-api-access-v8tht") 
pod "7538d496-3768-42b7-9f2e-70e1b44a9d6b" (UID: "7538d496-3768-42b7-9f2e-70e1b44a9d6b"). InnerVolumeSpecName "kube-api-access-v8tht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.505197 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7538d496-3768-42b7-9f2e-70e1b44a9d6b" (UID: "7538d496-3768-42b7-9f2e-70e1b44a9d6b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.527816 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-inventory" (OuterVolumeSpecName: "inventory") pod "7538d496-3768-42b7-9f2e-70e1b44a9d6b" (UID: "7538d496-3768-42b7-9f2e-70e1b44a9d6b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.581309 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.581343 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7538d496-3768-42b7-9f2e-70e1b44a9d6b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.581358 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8tht\" (UniqueName: \"kubernetes.io/projected/7538d496-3768-42b7-9f2e-70e1b44a9d6b-kube-api-access-v8tht\") on node \"crc\" DevicePath \"\"" Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.968147 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" event={"ID":"7538d496-3768-42b7-9f2e-70e1b44a9d6b","Type":"ContainerDied","Data":"c2bed0a99a23391c3bf403e5b41349a8afe66f05e34f738cc9341c78027bd20e"} Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.968201 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2bed0a99a23391c3bf403e5b41349a8afe66f05e34f738cc9341c78027bd20e" Mar 21 04:55:42 crc kubenswrapper[4839]: I0321 04:55:42.968218 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xdvx2" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.054499 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"] Mar 21 04:55:43 crc kubenswrapper[4839]: E0321 04:55:43.054952 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7538d496-3768-42b7-9f2e-70e1b44a9d6b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.054972 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="7538d496-3768-42b7-9f2e-70e1b44a9d6b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.055184 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="7538d496-3768-42b7-9f2e-70e1b44a9d6b" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.055927 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.058466 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.058656 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.059673 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.059760 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.076936 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"] Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.092602 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blxs6\" (UniqueName: \"kubernetes.io/projected/ab9d4433-fe0e-471b-84f8-568b31920ed3-kube-api-access-blxs6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.092650 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" Mar 21 04:55:43 crc 
kubenswrapper[4839]: I0321 04:55:43.092735 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.194375 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.194541 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blxs6\" (UniqueName: \"kubernetes.io/projected/ab9d4433-fe0e-471b-84f8-568b31920ed3-kube-api-access-blxs6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.194588 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.198354 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.198454 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.209840 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blxs6\" (UniqueName: \"kubernetes.io/projected/ab9d4433-fe0e-471b-84f8-568b31920ed3-kube-api-access-blxs6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qkclf\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.376423 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.901162 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf"] Mar 21 04:55:43 crc kubenswrapper[4839]: I0321 04:55:43.977530 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" event={"ID":"ab9d4433-fe0e-471b-84f8-568b31920ed3","Type":"ContainerStarted","Data":"4d2611a14110b17c867ac61b555f142cd249467327dd49d330816ffb10a58194"} Mar 21 04:55:44 crc kubenswrapper[4839]: I0321 04:55:44.987507 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" event={"ID":"ab9d4433-fe0e-471b-84f8-568b31920ed3","Type":"ContainerStarted","Data":"da612acf6dc5607ae4b2dde018284e7ea25cf4afa5712325128aa32f44b461be"} Mar 21 04:55:45 crc kubenswrapper[4839]: I0321 04:55:45.012323 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" podStartSLOduration=1.586092883 podStartE2EDuration="2.012299612s" podCreationTimestamp="2026-03-21 04:55:43 +0000 UTC" firstStartedPulling="2026-03-21 04:55:43.900335291 +0000 UTC m=+1948.228121977" lastFinishedPulling="2026-03-21 04:55:44.32654203 +0000 UTC m=+1948.654328706" observedRunningTime="2026-03-21 04:55:45.003239456 +0000 UTC m=+1949.331026152" watchObservedRunningTime="2026-03-21 04:55:45.012299612 +0000 UTC m=+1949.340086288" Mar 21 04:55:46 crc kubenswrapper[4839]: I0321 04:55:46.047495 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-4zz89"] Mar 21 04:55:46 crc kubenswrapper[4839]: I0321 04:55:46.057498 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-48d5-account-create-update-5k79b"] Mar 21 04:55:46 crc kubenswrapper[4839]: 
I0321 04:55:46.066534 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-48d5-account-create-update-5k79b"] Mar 21 04:55:46 crc kubenswrapper[4839]: I0321 04:55:46.074560 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-4zz89"] Mar 21 04:55:46 crc kubenswrapper[4839]: I0321 04:55:46.465069 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76e9253-1495-42d5-910f-cce6f2730243" path="/var/lib/kubelet/pods/b76e9253-1495-42d5-910f-cce6f2730243/volumes" Mar 21 04:55:46 crc kubenswrapper[4839]: I0321 04:55:46.466420 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f481fb0d-ac2f-4989-a547-50f5081e4e78" path="/var/lib/kubelet/pods/f481fb0d-ac2f-4989-a547-50f5081e4e78/volumes" Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.034367 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-94b7-account-create-update-zmpzr"] Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.044700 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-ds7tq"] Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.054695 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-46c8-account-create-update-mp8jl"] Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.064910 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-w9wx6"] Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.073730 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-46c8-account-create-update-mp8jl"] Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.081972 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-ds7tq"] Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.089829 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-94b7-account-create-update-zmpzr"] 
Mar 21 04:55:47 crc kubenswrapper[4839]: I0321 04:55:47.097761 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-w9wx6"] Mar 21 04:55:48 crc kubenswrapper[4839]: I0321 04:55:48.470443 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4185a56e-9d10-4aea-ad84-a865dff3e6be" path="/var/lib/kubelet/pods/4185a56e-9d10-4aea-ad84-a865dff3e6be/volumes" Mar 21 04:55:48 crc kubenswrapper[4839]: I0321 04:55:48.471369 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46c56098-2959-4bd0-b762-36a4ee1bb2e6" path="/var/lib/kubelet/pods/46c56098-2959-4bd0-b762-36a4ee1bb2e6/volumes" Mar 21 04:55:48 crc kubenswrapper[4839]: I0321 04:55:48.471999 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60534a44-1538-4bdb-81d1-043c9ae84cee" path="/var/lib/kubelet/pods/60534a44-1538-4bdb-81d1-043c9ae84cee/volumes" Mar 21 04:55:48 crc kubenswrapper[4839]: I0321 04:55:48.472526 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9220ed3c-2e97-4efc-a4cc-28bb29774ad8" path="/var/lib/kubelet/pods/9220ed3c-2e97-4efc-a4cc-28bb29774ad8/volumes" Mar 21 04:55:54 crc kubenswrapper[4839]: I0321 04:55:54.453982 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:55:54 crc kubenswrapper[4839]: E0321 04:55:54.454710 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:55:54 crc kubenswrapper[4839]: I0321 04:55:54.788325 4839 scope.go:117] "RemoveContainer" 
containerID="89d53502805454e28eeac8a6f5794fb2c1a2eba3acba95c45fd4f0d839ae56ac" Mar 21 04:55:54 crc kubenswrapper[4839]: I0321 04:55:54.861532 4839 scope.go:117] "RemoveContainer" containerID="ccd22af7723d538ca33a42ba3654ebdb55e8713c02134e6ab93cc893ad28c76a" Mar 21 04:55:54 crc kubenswrapper[4839]: I0321 04:55:54.898883 4839 scope.go:117] "RemoveContainer" containerID="2de908b5bd6bba55215cf326e7323c0123b89a96311bd62e86b355ee0ff19bc1" Mar 21 04:55:54 crc kubenswrapper[4839]: I0321 04:55:54.925344 4839 scope.go:117] "RemoveContainer" containerID="d3d91629ebc8060afc821dc6f6ff1f1f4f9eb9613514c223b3a39c31ccd40e5c" Mar 21 04:55:54 crc kubenswrapper[4839]: I0321 04:55:54.974051 4839 scope.go:117] "RemoveContainer" containerID="296c6956b7a45c772d2bc75858a9b2db91782289c6cf30854b24fd106bb5d692" Mar 21 04:55:55 crc kubenswrapper[4839]: I0321 04:55:55.018631 4839 scope.go:117] "RemoveContainer" containerID="9c0964d074027bd8b20f7561440904d50f0f5ad70eed7435ed8da532c09da947" Mar 21 04:55:55 crc kubenswrapper[4839]: I0321 04:55:55.052462 4839 scope.go:117] "RemoveContainer" containerID="c7f784ce54bb50fe64fb506149fb81059511360e35c18d47126e20bcbe758d00" Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.140392 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567816-8qfld"] Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.142291 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-8qfld" Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.144354 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.144725 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.145006 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.150818 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-8qfld"] Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.243089 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjmcj\" (UniqueName: \"kubernetes.io/projected/f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8-kube-api-access-qjmcj\") pod \"auto-csr-approver-29567816-8qfld\" (UID: \"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8\") " pod="openshift-infra/auto-csr-approver-29567816-8qfld" Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.344773 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjmcj\" (UniqueName: \"kubernetes.io/projected/f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8-kube-api-access-qjmcj\") pod \"auto-csr-approver-29567816-8qfld\" (UID: \"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8\") " pod="openshift-infra/auto-csr-approver-29567816-8qfld" Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.363335 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjmcj\" (UniqueName: \"kubernetes.io/projected/f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8-kube-api-access-qjmcj\") pod \"auto-csr-approver-29567816-8qfld\" (UID: \"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8\") " 
pod="openshift-infra/auto-csr-approver-29567816-8qfld" Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.464558 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-8qfld" Mar 21 04:56:00 crc kubenswrapper[4839]: I0321 04:56:00.891871 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-8qfld"] Mar 21 04:56:01 crc kubenswrapper[4839]: I0321 04:56:01.117949 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567816-8qfld" event={"ID":"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8","Type":"ContainerStarted","Data":"02aac1e91b6557da8f5a23c314a49628a032623832ecd33d29794d3712751930"} Mar 21 04:56:03 crc kubenswrapper[4839]: I0321 04:56:03.134934 4839 generic.go:334] "Generic (PLEG): container finished" podID="f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8" containerID="4fe2426cb283c93b9728be8cbc10600e5f92f98c8d9cf9800594541cb0424886" exitCode=0 Mar 21 04:56:03 crc kubenswrapper[4839]: I0321 04:56:03.135110 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567816-8qfld" event={"ID":"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8","Type":"ContainerDied","Data":"4fe2426cb283c93b9728be8cbc10600e5f92f98c8d9cf9800594541cb0424886"} Mar 21 04:56:04 crc kubenswrapper[4839]: I0321 04:56:04.463822 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-8qfld" Mar 21 04:56:04 crc kubenswrapper[4839]: I0321 04:56:04.628464 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjmcj\" (UniqueName: \"kubernetes.io/projected/f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8-kube-api-access-qjmcj\") pod \"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8\" (UID: \"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8\") " Mar 21 04:56:04 crc kubenswrapper[4839]: I0321 04:56:04.636578 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8-kube-api-access-qjmcj" (OuterVolumeSpecName: "kube-api-access-qjmcj") pod "f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8" (UID: "f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8"). InnerVolumeSpecName "kube-api-access-qjmcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:56:04 crc kubenswrapper[4839]: I0321 04:56:04.731219 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjmcj\" (UniqueName: \"kubernetes.io/projected/f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8-kube-api-access-qjmcj\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:05 crc kubenswrapper[4839]: I0321 04:56:05.151369 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567816-8qfld" event={"ID":"f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8","Type":"ContainerDied","Data":"02aac1e91b6557da8f5a23c314a49628a032623832ecd33d29794d3712751930"} Mar 21 04:56:05 crc kubenswrapper[4839]: I0321 04:56:05.151402 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567816-8qfld" Mar 21 04:56:05 crc kubenswrapper[4839]: I0321 04:56:05.151416 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02aac1e91b6557da8f5a23c314a49628a032623832ecd33d29794d3712751930" Mar 21 04:56:05 crc kubenswrapper[4839]: I0321 04:56:05.527926 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567810-hr2wf"] Mar 21 04:56:05 crc kubenswrapper[4839]: I0321 04:56:05.535394 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567810-hr2wf"] Mar 21 04:56:06 crc kubenswrapper[4839]: I0321 04:56:06.457895 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:56:06 crc kubenswrapper[4839]: E0321 04:56:06.458522 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:56:06 crc kubenswrapper[4839]: I0321 04:56:06.462496 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf9d6591-e9e7-485d-96f3-8f32958ac530" path="/var/lib/kubelet/pods/cf9d6591-e9e7-485d-96f3-8f32958ac530/volumes" Mar 21 04:56:19 crc kubenswrapper[4839]: I0321 04:56:19.034715 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5dvtr"] Mar 21 04:56:19 crc kubenswrapper[4839]: I0321 04:56:19.043519 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5dvtr"] Mar 21 04:56:20 crc kubenswrapper[4839]: I0321 04:56:20.453445 4839 scope.go:117] 
"RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:56:20 crc kubenswrapper[4839]: E0321 04:56:20.454158 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:56:20 crc kubenswrapper[4839]: I0321 04:56:20.466595 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbaf057c-375e-4da6-a7cd-8c879a51ff50" path="/var/lib/kubelet/pods/bbaf057c-375e-4da6-a7cd-8c879a51ff50/volumes" Mar 21 04:56:29 crc kubenswrapper[4839]: I0321 04:56:29.359666 4839 generic.go:334] "Generic (PLEG): container finished" podID="ab9d4433-fe0e-471b-84f8-568b31920ed3" containerID="da612acf6dc5607ae4b2dde018284e7ea25cf4afa5712325128aa32f44b461be" exitCode=0 Mar 21 04:56:29 crc kubenswrapper[4839]: I0321 04:56:29.359872 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" event={"ID":"ab9d4433-fe0e-471b-84f8-568b31920ed3","Type":"ContainerDied","Data":"da612acf6dc5607ae4b2dde018284e7ea25cf4afa5712325128aa32f44b461be"} Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.777069 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.828252 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-ssh-key-openstack-edpm-ipam\") pod \"ab9d4433-fe0e-471b-84f8-568b31920ed3\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.828447 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-inventory\") pod \"ab9d4433-fe0e-471b-84f8-568b31920ed3\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.828483 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blxs6\" (UniqueName: \"kubernetes.io/projected/ab9d4433-fe0e-471b-84f8-568b31920ed3-kube-api-access-blxs6\") pod \"ab9d4433-fe0e-471b-84f8-568b31920ed3\" (UID: \"ab9d4433-fe0e-471b-84f8-568b31920ed3\") " Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.847785 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9d4433-fe0e-471b-84f8-568b31920ed3-kube-api-access-blxs6" (OuterVolumeSpecName: "kube-api-access-blxs6") pod "ab9d4433-fe0e-471b-84f8-568b31920ed3" (UID: "ab9d4433-fe0e-471b-84f8-568b31920ed3"). InnerVolumeSpecName "kube-api-access-blxs6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.861399 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ab9d4433-fe0e-471b-84f8-568b31920ed3" (UID: "ab9d4433-fe0e-471b-84f8-568b31920ed3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.866697 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-inventory" (OuterVolumeSpecName: "inventory") pod "ab9d4433-fe0e-471b-84f8-568b31920ed3" (UID: "ab9d4433-fe0e-471b-84f8-568b31920ed3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.931235 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.931277 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab9d4433-fe0e-471b-84f8-568b31920ed3-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:30 crc kubenswrapper[4839]: I0321 04:56:30.931290 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blxs6\" (UniqueName: \"kubernetes.io/projected/ab9d4433-fe0e-471b-84f8-568b31920ed3-kube-api-access-blxs6\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.377308 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" 
event={"ID":"ab9d4433-fe0e-471b-84f8-568b31920ed3","Type":"ContainerDied","Data":"4d2611a14110b17c867ac61b555f142cd249467327dd49d330816ffb10a58194"} Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.377666 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d2611a14110b17c867ac61b555f142cd249467327dd49d330816ffb10a58194" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.377528 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qkclf" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.469252 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-chfcw"] Mar 21 04:56:31 crc kubenswrapper[4839]: E0321 04:56:31.469730 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9d4433-fe0e-471b-84f8-568b31920ed3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.469758 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9d4433-fe0e-471b-84f8-568b31920ed3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:56:31 crc kubenswrapper[4839]: E0321 04:56:31.469808 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8" containerName="oc" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.469817 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8" containerName="oc" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.470050 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8" containerName="oc" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.470070 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9d4433-fe0e-471b-84f8-568b31920ed3" 
containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.475179 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.477666 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.477681 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.477964 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.480341 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-chfcw"] Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.481449 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.542937 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gf4r\" (UniqueName: \"kubernetes.io/projected/39dbacec-c845-4f19-92a9-c0e63fba203c-kube-api-access-8gf4r\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.543087 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.543124 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.645252 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.645308 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.645388 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gf4r\" (UniqueName: \"kubernetes.io/projected/39dbacec-c845-4f19-92a9-c0e63fba203c-kube-api-access-8gf4r\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.649434 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.652017 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.664406 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gf4r\" (UniqueName: \"kubernetes.io/projected/39dbacec-c845-4f19-92a9-c0e63fba203c-kube-api-access-8gf4r\") pod \"ssh-known-hosts-edpm-deployment-chfcw\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" Mar 21 04:56:31 crc kubenswrapper[4839]: I0321 04:56:31.802801 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" Mar 21 04:56:32 crc kubenswrapper[4839]: I0321 04:56:32.332482 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-chfcw"] Mar 21 04:56:32 crc kubenswrapper[4839]: I0321 04:56:32.386784 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" event={"ID":"39dbacec-c845-4f19-92a9-c0e63fba203c","Type":"ContainerStarted","Data":"45cf77cbb165cdeab83452c0e8e2aa4aea1d85e4c4eb12a0e746b22674d3f296"} Mar 21 04:56:33 crc kubenswrapper[4839]: I0321 04:56:33.400300 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" event={"ID":"39dbacec-c845-4f19-92a9-c0e63fba203c","Type":"ContainerStarted","Data":"6cf9b33fd67e98124ff5ecedbd15da8038542728bbcd98b40d8d307a3bc9485c"} Mar 21 04:56:33 crc kubenswrapper[4839]: I0321 04:56:33.416772 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" podStartSLOduration=1.804084568 podStartE2EDuration="2.416748449s" podCreationTimestamp="2026-03-21 04:56:31 +0000 UTC" firstStartedPulling="2026-03-21 04:56:32.340708122 +0000 UTC m=+1996.668494798" lastFinishedPulling="2026-03-21 04:56:32.953371983 +0000 UTC m=+1997.281158679" observedRunningTime="2026-03-21 04:56:33.413139868 +0000 UTC m=+1997.740926544" watchObservedRunningTime="2026-03-21 04:56:33.416748449 +0000 UTC m=+1997.744535125" Mar 21 04:56:35 crc kubenswrapper[4839]: I0321 04:56:35.453386 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:56:35 crc kubenswrapper[4839]: E0321 04:56:35.454086 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:56:38 crc kubenswrapper[4839]: I0321 04:56:38.040150 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-csj7l"] Mar 21 04:56:38 crc kubenswrapper[4839]: I0321 04:56:38.051972 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-csj7l"] Mar 21 04:56:38 crc kubenswrapper[4839]: I0321 04:56:38.465007 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37c6fbf7-427d-45a8-b190-439265c8d6d0" path="/var/lib/kubelet/pods/37c6fbf7-427d-45a8-b190-439265c8d6d0/volumes" Mar 21 04:56:39 crc kubenswrapper[4839]: I0321 04:56:39.455254 4839 generic.go:334] "Generic (PLEG): container finished" podID="39dbacec-c845-4f19-92a9-c0e63fba203c" containerID="6cf9b33fd67e98124ff5ecedbd15da8038542728bbcd98b40d8d307a3bc9485c" exitCode=0 Mar 21 04:56:39 crc kubenswrapper[4839]: I0321 04:56:39.455334 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" event={"ID":"39dbacec-c845-4f19-92a9-c0e63fba203c","Type":"ContainerDied","Data":"6cf9b33fd67e98124ff5ecedbd15da8038542728bbcd98b40d8d307a3bc9485c"} Mar 21 04:56:40 crc kubenswrapper[4839]: I0321 04:56:40.876782 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" Mar 21 04:56:40 crc kubenswrapper[4839]: I0321 04:56:40.945022 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-ssh-key-openstack-edpm-ipam\") pod \"39dbacec-c845-4f19-92a9-c0e63fba203c\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " Mar 21 04:56:40 crc kubenswrapper[4839]: I0321 04:56:40.945291 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-inventory-0\") pod \"39dbacec-c845-4f19-92a9-c0e63fba203c\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " Mar 21 04:56:40 crc kubenswrapper[4839]: I0321 04:56:40.946459 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gf4r\" (UniqueName: \"kubernetes.io/projected/39dbacec-c845-4f19-92a9-c0e63fba203c-kube-api-access-8gf4r\") pod \"39dbacec-c845-4f19-92a9-c0e63fba203c\" (UID: \"39dbacec-c845-4f19-92a9-c0e63fba203c\") " Mar 21 04:56:40 crc kubenswrapper[4839]: I0321 04:56:40.951588 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39dbacec-c845-4f19-92a9-c0e63fba203c-kube-api-access-8gf4r" (OuterVolumeSpecName: "kube-api-access-8gf4r") pod "39dbacec-c845-4f19-92a9-c0e63fba203c" (UID: "39dbacec-c845-4f19-92a9-c0e63fba203c"). InnerVolumeSpecName "kube-api-access-8gf4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:56:40 crc kubenswrapper[4839]: I0321 04:56:40.972950 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "39dbacec-c845-4f19-92a9-c0e63fba203c" (UID: "39dbacec-c845-4f19-92a9-c0e63fba203c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:56:40 crc kubenswrapper[4839]: I0321 04:56:40.973172 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "39dbacec-c845-4f19-92a9-c0e63fba203c" (UID: "39dbacec-c845-4f19-92a9-c0e63fba203c"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.048986 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.049482 4839 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/39dbacec-c845-4f19-92a9-c0e63fba203c-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.049554 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gf4r\" (UniqueName: \"kubernetes.io/projected/39dbacec-c845-4f19-92a9-c0e63fba203c-kube-api-access-8gf4r\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.473192 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" 
event={"ID":"39dbacec-c845-4f19-92a9-c0e63fba203c","Type":"ContainerDied","Data":"45cf77cbb165cdeab83452c0e8e2aa4aea1d85e4c4eb12a0e746b22674d3f296"} Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.473244 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45cf77cbb165cdeab83452c0e8e2aa4aea1d85e4c4eb12a0e746b22674d3f296" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.473261 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-chfcw" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.550679 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl"] Mar 21 04:56:41 crc kubenswrapper[4839]: E0321 04:56:41.551091 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39dbacec-c845-4f19-92a9-c0e63fba203c" containerName="ssh-known-hosts-edpm-deployment" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.551114 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="39dbacec-c845-4f19-92a9-c0e63fba203c" containerName="ssh-known-hosts-edpm-deployment" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.551289 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="39dbacec-c845-4f19-92a9-c0e63fba203c" containerName="ssh-known-hosts-edpm-deployment" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.551933 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.558829 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.559132 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.559328 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.559441 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.570482 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl"] Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.659378 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c444z\" (UniqueName: \"kubernetes.io/projected/26adbd7b-7994-4bea-9f94-338881339833-kube-api-access-c444z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.659509 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.659585 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.761085 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.761192 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.761305 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c444z\" (UniqueName: \"kubernetes.io/projected/26adbd7b-7994-4bea-9f94-338881339833-kube-api-access-c444z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.765332 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.766757 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.783894 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c444z\" (UniqueName: \"kubernetes.io/projected/26adbd7b-7994-4bea-9f94-338881339833-kube-api-access-c444z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-55fzl\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:41 crc kubenswrapper[4839]: I0321 04:56:41.871365 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:42 crc kubenswrapper[4839]: I0321 04:56:42.038928 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jznl6"] Mar 21 04:56:42 crc kubenswrapper[4839]: I0321 04:56:42.048818 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jznl6"] Mar 21 04:56:42 crc kubenswrapper[4839]: I0321 04:56:42.426470 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl"] Mar 21 04:56:42 crc kubenswrapper[4839]: I0321 04:56:42.468120 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="500decd4-2b92-4e52-bfa8-bb8d1fe13b9d" path="/var/lib/kubelet/pods/500decd4-2b92-4e52-bfa8-bb8d1fe13b9d/volumes" Mar 21 04:56:42 crc kubenswrapper[4839]: I0321 04:56:42.482146 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" event={"ID":"26adbd7b-7994-4bea-9f94-338881339833","Type":"ContainerStarted","Data":"8f682b0989285d8c00029eab5269a32b2effbb8bd30193f56340e53ee018216a"} Mar 21 04:56:43 crc kubenswrapper[4839]: I0321 04:56:43.491969 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" event={"ID":"26adbd7b-7994-4bea-9f94-338881339833","Type":"ContainerStarted","Data":"57c38841df5f83fb1292c8c5338647bdc36120ad9d503b33d4421151aaab3f6e"} Mar 21 04:56:43 crc kubenswrapper[4839]: I0321 04:56:43.516235 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" podStartSLOduration=2.118632058 podStartE2EDuration="2.516220181s" podCreationTimestamp="2026-03-21 04:56:41 +0000 UTC" firstStartedPulling="2026-03-21 04:56:42.446374418 +0000 UTC m=+2006.774161094" lastFinishedPulling="2026-03-21 04:56:42.843962551 +0000 UTC 
m=+2007.171749217" observedRunningTime="2026-03-21 04:56:43.511945311 +0000 UTC m=+2007.839731997" watchObservedRunningTime="2026-03-21 04:56:43.516220181 +0000 UTC m=+2007.844006857" Mar 21 04:56:47 crc kubenswrapper[4839]: I0321 04:56:47.453086 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:56:47 crc kubenswrapper[4839]: E0321 04:56:47.453853 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:56:50 crc kubenswrapper[4839]: E0321 04:56:50.340513 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26adbd7b_7994_4bea_9f94_338881339833.slice/crio-57c38841df5f83fb1292c8c5338647bdc36120ad9d503b33d4421151aaab3f6e.scope\": RecentStats: unable to find data in memory cache]" Mar 21 04:56:50 crc kubenswrapper[4839]: I0321 04:56:50.560072 4839 generic.go:334] "Generic (PLEG): container finished" podID="26adbd7b-7994-4bea-9f94-338881339833" containerID="57c38841df5f83fb1292c8c5338647bdc36120ad9d503b33d4421151aaab3f6e" exitCode=0 Mar 21 04:56:50 crc kubenswrapper[4839]: I0321 04:56:50.560123 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" event={"ID":"26adbd7b-7994-4bea-9f94-338881339833","Type":"ContainerDied","Data":"57c38841df5f83fb1292c8c5338647bdc36120ad9d503b33d4421151aaab3f6e"} Mar 21 04:56:51 crc kubenswrapper[4839]: I0321 04:56:51.996408 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.054531 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-ssh-key-openstack-edpm-ipam\") pod \"26adbd7b-7994-4bea-9f94-338881339833\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.054666 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-inventory\") pod \"26adbd7b-7994-4bea-9f94-338881339833\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.054709 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c444z\" (UniqueName: \"kubernetes.io/projected/26adbd7b-7994-4bea-9f94-338881339833-kube-api-access-c444z\") pod \"26adbd7b-7994-4bea-9f94-338881339833\" (UID: \"26adbd7b-7994-4bea-9f94-338881339833\") " Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.059992 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26adbd7b-7994-4bea-9f94-338881339833-kube-api-access-c444z" (OuterVolumeSpecName: "kube-api-access-c444z") pod "26adbd7b-7994-4bea-9f94-338881339833" (UID: "26adbd7b-7994-4bea-9f94-338881339833"). InnerVolumeSpecName "kube-api-access-c444z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.085977 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-inventory" (OuterVolumeSpecName: "inventory") pod "26adbd7b-7994-4bea-9f94-338881339833" (UID: "26adbd7b-7994-4bea-9f94-338881339833"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.086076 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "26adbd7b-7994-4bea-9f94-338881339833" (UID: "26adbd7b-7994-4bea-9f94-338881339833"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.157772 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.157824 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/26adbd7b-7994-4bea-9f94-338881339833-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.157837 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c444z\" (UniqueName: \"kubernetes.io/projected/26adbd7b-7994-4bea-9f94-338881339833-kube-api-access-c444z\") on node \"crc\" DevicePath \"\"" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.579076 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" event={"ID":"26adbd7b-7994-4bea-9f94-338881339833","Type":"ContainerDied","Data":"8f682b0989285d8c00029eab5269a32b2effbb8bd30193f56340e53ee018216a"} Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.579114 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f682b0989285d8c00029eab5269a32b2effbb8bd30193f56340e53ee018216a" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 
04:56:52.579135 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-55fzl" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.639795 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r"] Mar 21 04:56:52 crc kubenswrapper[4839]: E0321 04:56:52.640180 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26adbd7b-7994-4bea-9f94-338881339833" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.640200 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="26adbd7b-7994-4bea-9f94-338881339833" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.640415 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="26adbd7b-7994-4bea-9f94-338881339833" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.641179 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.644433 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.644434 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.645294 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.645770 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.655722 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r"] Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.666423 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.666484 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgthl\" (UniqueName: \"kubernetes.io/projected/66c3e343-3306-455d-89d7-db17c1bd53ed-kube-api-access-tgthl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.666551 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.768209 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.768345 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.768386 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgthl\" (UniqueName: \"kubernetes.io/projected/66c3e343-3306-455d-89d7-db17c1bd53ed-kube-api-access-tgthl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.771737 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.772234 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.792867 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgthl\" (UniqueName: \"kubernetes.io/projected/66c3e343-3306-455d-89d7-db17c1bd53ed-kube-api-access-tgthl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:52 crc kubenswrapper[4839]: I0321 04:56:52.957073 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:56:53 crc kubenswrapper[4839]: I0321 04:56:53.453335 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r"] Mar 21 04:56:53 crc kubenswrapper[4839]: W0321 04:56:53.456482 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66c3e343_3306_455d_89d7_db17c1bd53ed.slice/crio-b38083ec341a913ae40633cd241eb560533502df4320d1402330cb006f7cab05 WatchSource:0}: Error finding container b38083ec341a913ae40633cd241eb560533502df4320d1402330cb006f7cab05: Status 404 returned error can't find the container with id b38083ec341a913ae40633cd241eb560533502df4320d1402330cb006f7cab05 Mar 21 04:56:53 crc kubenswrapper[4839]: I0321 04:56:53.587784 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" event={"ID":"66c3e343-3306-455d-89d7-db17c1bd53ed","Type":"ContainerStarted","Data":"b38083ec341a913ae40633cd241eb560533502df4320d1402330cb006f7cab05"} Mar 21 04:56:54 crc kubenswrapper[4839]: I0321 04:56:54.596591 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" event={"ID":"66c3e343-3306-455d-89d7-db17c1bd53ed","Type":"ContainerStarted","Data":"9221954f588154e2a1711698f6e316dc7635cd9779ff5b27684643d387bfa5bc"} Mar 21 04:56:54 crc kubenswrapper[4839]: I0321 04:56:54.618294 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" podStartSLOduration=2.162687536 podStartE2EDuration="2.618274222s" podCreationTimestamp="2026-03-21 04:56:52 +0000 UTC" firstStartedPulling="2026-03-21 04:56:53.458835825 +0000 UTC m=+2017.786622501" lastFinishedPulling="2026-03-21 04:56:53.914422511 +0000 UTC m=+2018.242209187" 
observedRunningTime="2026-03-21 04:56:54.610990067 +0000 UTC m=+2018.938776733" watchObservedRunningTime="2026-03-21 04:56:54.618274222 +0000 UTC m=+2018.946060898" Mar 21 04:56:55 crc kubenswrapper[4839]: I0321 04:56:55.176434 4839 scope.go:117] "RemoveContainer" containerID="3f39162e6963343de8c3eafe8a89ac888be7f9493499afd89bf8375748fc8e0f" Mar 21 04:56:55 crc kubenswrapper[4839]: I0321 04:56:55.220070 4839 scope.go:117] "RemoveContainer" containerID="6500e5c41c0724032a37daabaaadca5a2ab96ab0732aaceeaaccdf5e739d902c" Mar 21 04:56:55 crc kubenswrapper[4839]: I0321 04:56:55.263033 4839 scope.go:117] "RemoveContainer" containerID="118f2c293ce181a9defa7eb0621b40d7a4ec32e8ea91c36b0f98ccebfdd6ba13" Mar 21 04:56:55 crc kubenswrapper[4839]: I0321 04:56:55.310865 4839 scope.go:117] "RemoveContainer" containerID="a57d3dec4c234a21b088b3986b8d9a4b8012dec53cc26619ad9bdd0f9475d8cc" Mar 21 04:56:58 crc kubenswrapper[4839]: I0321 04:56:58.453867 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:56:58 crc kubenswrapper[4839]: E0321 04:56:58.454647 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:57:03 crc kubenswrapper[4839]: I0321 04:57:03.663338 4839 generic.go:334] "Generic (PLEG): container finished" podID="66c3e343-3306-455d-89d7-db17c1bd53ed" containerID="9221954f588154e2a1711698f6e316dc7635cd9779ff5b27684643d387bfa5bc" exitCode=0 Mar 21 04:57:03 crc kubenswrapper[4839]: I0321 04:57:03.663426 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" 
event={"ID":"66c3e343-3306-455d-89d7-db17c1bd53ed","Type":"ContainerDied","Data":"9221954f588154e2a1711698f6e316dc7635cd9779ff5b27684643d387bfa5bc"} Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.083999 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.190459 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-ssh-key-openstack-edpm-ipam\") pod \"66c3e343-3306-455d-89d7-db17c1bd53ed\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.190518 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-inventory\") pod \"66c3e343-3306-455d-89d7-db17c1bd53ed\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.190614 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgthl\" (UniqueName: \"kubernetes.io/projected/66c3e343-3306-455d-89d7-db17c1bd53ed-kube-api-access-tgthl\") pod \"66c3e343-3306-455d-89d7-db17c1bd53ed\" (UID: \"66c3e343-3306-455d-89d7-db17c1bd53ed\") " Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.198389 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c3e343-3306-455d-89d7-db17c1bd53ed-kube-api-access-tgthl" (OuterVolumeSpecName: "kube-api-access-tgthl") pod "66c3e343-3306-455d-89d7-db17c1bd53ed" (UID: "66c3e343-3306-455d-89d7-db17c1bd53ed"). InnerVolumeSpecName "kube-api-access-tgthl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.216137 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-inventory" (OuterVolumeSpecName: "inventory") pod "66c3e343-3306-455d-89d7-db17c1bd53ed" (UID: "66c3e343-3306-455d-89d7-db17c1bd53ed"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.218089 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "66c3e343-3306-455d-89d7-db17c1bd53ed" (UID: "66c3e343-3306-455d-89d7-db17c1bd53ed"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.293170 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.293200 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66c3e343-3306-455d-89d7-db17c1bd53ed-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.293209 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgthl\" (UniqueName: \"kubernetes.io/projected/66c3e343-3306-455d-89d7-db17c1bd53ed-kube-api-access-tgthl\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.683470 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" 
event={"ID":"66c3e343-3306-455d-89d7-db17c1bd53ed","Type":"ContainerDied","Data":"b38083ec341a913ae40633cd241eb560533502df4320d1402330cb006f7cab05"} Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.683518 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b38083ec341a913ae40633cd241eb560533502df4320d1402330cb006f7cab05" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.683536 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.763584 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7"] Mar 21 04:57:05 crc kubenswrapper[4839]: E0321 04:57:05.764016 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c3e343-3306-455d-89d7-db17c1bd53ed" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.764038 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c3e343-3306-455d-89d7-db17c1bd53ed" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.764307 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c3e343-3306-455d-89d7-db17c1bd53ed" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.765083 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.770748 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.770802 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.770749 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.770908 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.771038 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.771146 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.772646 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.772882 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.778697 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7"] Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.902901 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.902981 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903034 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903244 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903315 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903388 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903491 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903551 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903706 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903756 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44gls\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-kube-api-access-44gls\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903837 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903897 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.903956 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:05 crc kubenswrapper[4839]: I0321 04:57:05.904040 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006076 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006128 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006152 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006190 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006212 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006276 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006308 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44gls\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-kube-api-access-44gls\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: 
\"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006346 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006396 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006429 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006475 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006511 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006540 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.006596 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.010666 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.011122 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.011956 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.011968 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.013179 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: 
I0321 04:57:06.013212 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.013271 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.013350 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.013700 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.014362 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.014469 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.014852 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.019133 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.024604 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44gls\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-kube-api-access-44gls\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.083900 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.585605 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7"] Mar 21 04:57:06 crc kubenswrapper[4839]: I0321 04:57:06.693223 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" event={"ID":"268d87b5-57ec-49ff-be62-fe59e6b4b819","Type":"ContainerStarted","Data":"6e8b398cf3e59e505ca93e4e71d85f466e23b3528386cc1cb498ab7fcfbcfaeb"} Mar 21 04:57:07 crc kubenswrapper[4839]: I0321 04:57:07.700849 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" event={"ID":"268d87b5-57ec-49ff-be62-fe59e6b4b819","Type":"ContainerStarted","Data":"25ab4e1f891bddc34d799aea59670348f47ccbc48d537f5147c67b449b3ea5a6"} Mar 21 04:57:07 crc kubenswrapper[4839]: I0321 04:57:07.753386 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" podStartSLOduration=2.311778335 podStartE2EDuration="2.753359517s" podCreationTimestamp="2026-03-21 04:57:05 +0000 UTC" firstStartedPulling="2026-03-21 04:57:06.585627395 +0000 UTC m=+2030.913414081" lastFinishedPulling="2026-03-21 04:57:07.027208587 +0000 UTC m=+2031.354995263" observedRunningTime="2026-03-21 04:57:07.738062146 +0000 UTC m=+2032.065848822" watchObservedRunningTime="2026-03-21 04:57:07.753359517 +0000 UTC m=+2032.081146193" Mar 21 04:57:09 crc kubenswrapper[4839]: I0321 04:57:09.453168 4839 
scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:57:09 crc kubenswrapper[4839]: E0321 04:57:09.453462 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:57:21 crc kubenswrapper[4839]: I0321 04:57:21.453554 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:57:21 crc kubenswrapper[4839]: E0321 04:57:21.454466 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:57:23 crc kubenswrapper[4839]: I0321 04:57:23.038785 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-f7kjm"] Mar 21 04:57:23 crc kubenswrapper[4839]: I0321 04:57:23.051372 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-f7kjm"] Mar 21 04:57:24 crc kubenswrapper[4839]: I0321 04:57:24.463014 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c8778a4-d8b7-4331-be57-d1844b3c0f9f" path="/var/lib/kubelet/pods/6c8778a4-d8b7-4331-be57-d1844b3c0f9f/volumes" Mar 21 04:57:35 crc kubenswrapper[4839]: I0321 04:57:35.452734 4839 scope.go:117] "RemoveContainer" 
containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:57:35 crc kubenswrapper[4839]: E0321 04:57:35.453530 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:57:40 crc kubenswrapper[4839]: I0321 04:57:40.963843 4839 generic.go:334] "Generic (PLEG): container finished" podID="268d87b5-57ec-49ff-be62-fe59e6b4b819" containerID="25ab4e1f891bddc34d799aea59670348f47ccbc48d537f5147c67b449b3ea5a6" exitCode=0 Mar 21 04:57:40 crc kubenswrapper[4839]: I0321 04:57:40.963946 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" event={"ID":"268d87b5-57ec-49ff-be62-fe59e6b4b819","Type":"ContainerDied","Data":"25ab4e1f891bddc34d799aea59670348f47ccbc48d537f5147c67b449b3ea5a6"} Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.379914 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.502102 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-neutron-metadata-combined-ca-bundle\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.502184 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-libvirt-combined-ca-bundle\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.502308 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.502350 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ssh-key-openstack-edpm-ipam\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503201 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-inventory\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: 
\"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503237 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503294 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-bootstrap-combined-ca-bundle\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503338 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503472 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-telemetry-combined-ca-bundle\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503537 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-ovn-default-certs-0\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: 
\"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503603 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-repo-setup-combined-ca-bundle\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503639 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ovn-combined-ca-bundle\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503709 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-nova-combined-ca-bundle\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.503735 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44gls\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-kube-api-access-44gls\") pod \"268d87b5-57ec-49ff-be62-fe59e6b4b819\" (UID: \"268d87b5-57ec-49ff-be62-fe59e6b4b819\") " Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.508700 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.508985 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.509009 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.509045 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.509096 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.509115 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.511133 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.511155 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.511162 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-kube-api-access-44gls" (OuterVolumeSpecName: "kube-api-access-44gls") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "kube-api-access-44gls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.511184 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.511239 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.512169 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.530835 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-inventory" (OuterVolumeSpecName: "inventory") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.536536 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "268d87b5-57ec-49ff-be62-fe59e6b4b819" (UID: "268d87b5-57ec-49ff-be62-fe59e6b4b819"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.606929 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.606964 4839 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.606977 4839 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.606986 4839 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.606995 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44gls\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-kube-api-access-44gls\") on node \"crc\" DevicePath \"\"" Mar 21 
04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607004 4839 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607014 4839 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607024 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607035 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607045 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607057 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607069 4839 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607081 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/268d87b5-57ec-49ff-be62-fe59e6b4b819-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.607095 4839 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/268d87b5-57ec-49ff-be62-fe59e6b4b819-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.981891 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" event={"ID":"268d87b5-57ec-49ff-be62-fe59e6b4b819","Type":"ContainerDied","Data":"6e8b398cf3e59e505ca93e4e71d85f466e23b3528386cc1cb498ab7fcfbcfaeb"} Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.981937 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e8b398cf3e59e505ca93e4e71d85f466e23b3528386cc1cb498ab7fcfbcfaeb" Mar 21 04:57:42 crc kubenswrapper[4839]: I0321 04:57:42.981940 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.128902 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq"] Mar 21 04:57:43 crc kubenswrapper[4839]: E0321 04:57:43.129286 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="268d87b5-57ec-49ff-be62-fe59e6b4b819" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.129302 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="268d87b5-57ec-49ff-be62-fe59e6b4b819" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.129467 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="268d87b5-57ec-49ff-be62-fe59e6b4b819" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.130022 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.132079 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.134889 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.135065 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.135201 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.138923 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.157951 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq"] Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.217357 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.217494 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: 
\"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.217576 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.217722 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.217758 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp4jm\" (UniqueName: \"kubernetes.io/projected/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-kube-api-access-tp4jm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.319411 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.319467 4839 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-tp4jm\" (UniqueName: \"kubernetes.io/projected/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-kube-api-access-tp4jm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.319596 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.320051 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.320099 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.321036 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.323293 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.323503 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.329118 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.337213 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp4jm\" (UniqueName: \"kubernetes.io/projected/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-kube-api-access-tp4jm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v4wqq\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.461589 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:57:43 crc kubenswrapper[4839]: I0321 04:57:43.988466 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq"] Mar 21 04:57:44 crc kubenswrapper[4839]: I0321 04:57:44.999404 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" event={"ID":"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd","Type":"ContainerStarted","Data":"90bc916bb321224b514616b34e453eee1cee1631314ee728c0c789b978bf6856"} Mar 21 04:57:46 crc kubenswrapper[4839]: I0321 04:57:46.009649 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" event={"ID":"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd","Type":"ContainerStarted","Data":"eaaf1096c895c8ef87987d2b9b75baeb732a85617c19cf6724ad330b3a1d7d4a"} Mar 21 04:57:46 crc kubenswrapper[4839]: I0321 04:57:46.038972 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" podStartSLOduration=2.27872117 podStartE2EDuration="3.03895032s" podCreationTimestamp="2026-03-21 04:57:43 +0000 UTC" firstStartedPulling="2026-03-21 04:57:44.000574567 +0000 UTC m=+2068.328361243" lastFinishedPulling="2026-03-21 04:57:44.760803717 +0000 UTC m=+2069.088590393" observedRunningTime="2026-03-21 04:57:46.025941903 +0000 UTC m=+2070.353728599" watchObservedRunningTime="2026-03-21 04:57:46.03895032 +0000 UTC m=+2070.366737006" Mar 21 04:57:50 crc kubenswrapper[4839]: I0321 04:57:50.453274 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:57:50 crc kubenswrapper[4839]: E0321 04:57:50.453977 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 04:57:55 crc kubenswrapper[4839]: I0321 04:57:55.437298 4839 scope.go:117] "RemoveContainer" containerID="07f2e48c7301d0027bae700357d24a79a9ba9d36dd4d10cd8158d308e2f8bf3d" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.158275 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567818-qzz8l"] Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.160383 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-qzz8l" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.163434 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.163611 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.163614 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.168186 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-qzz8l"] Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.228228 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw85t\" (UniqueName: \"kubernetes.io/projected/04b644e0-9d17-491d-be8c-359dd9f82604-kube-api-access-rw85t\") pod \"auto-csr-approver-29567818-qzz8l\" (UID: \"04b644e0-9d17-491d-be8c-359dd9f82604\") " pod="openshift-infra/auto-csr-approver-29567818-qzz8l" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.329471 
4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw85t\" (UniqueName: \"kubernetes.io/projected/04b644e0-9d17-491d-be8c-359dd9f82604-kube-api-access-rw85t\") pod \"auto-csr-approver-29567818-qzz8l\" (UID: \"04b644e0-9d17-491d-be8c-359dd9f82604\") " pod="openshift-infra/auto-csr-approver-29567818-qzz8l" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.347240 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw85t\" (UniqueName: \"kubernetes.io/projected/04b644e0-9d17-491d-be8c-359dd9f82604-kube-api-access-rw85t\") pod \"auto-csr-approver-29567818-qzz8l\" (UID: \"04b644e0-9d17-491d-be8c-359dd9f82604\") " pod="openshift-infra/auto-csr-approver-29567818-qzz8l" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.490151 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-qzz8l" Mar 21 04:58:00 crc kubenswrapper[4839]: I0321 04:58:00.920825 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-qzz8l"] Mar 21 04:58:01 crc kubenswrapper[4839]: I0321 04:58:01.139776 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567818-qzz8l" event={"ID":"04b644e0-9d17-491d-be8c-359dd9f82604","Type":"ContainerStarted","Data":"e3a17cb1049773d20eadab08a7388a8d7ad767aaa4fd1b4d999e0094525c5ba2"} Mar 21 04:58:02 crc kubenswrapper[4839]: I0321 04:58:02.153666 4839 generic.go:334] "Generic (PLEG): container finished" podID="04b644e0-9d17-491d-be8c-359dd9f82604" containerID="c879183e5f723b0bd5065e25afca82cc281c19704902af0285103b69c58011ac" exitCode=0 Mar 21 04:58:02 crc kubenswrapper[4839]: I0321 04:58:02.153772 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567818-qzz8l" 
event={"ID":"04b644e0-9d17-491d-be8c-359dd9f82604","Type":"ContainerDied","Data":"c879183e5f723b0bd5065e25afca82cc281c19704902af0285103b69c58011ac"} Mar 21 04:58:02 crc kubenswrapper[4839]: I0321 04:58:02.453724 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 04:58:03 crc kubenswrapper[4839]: I0321 04:58:03.165400 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"f3d8592746d13c0ed95f298f8a3279e3766ac0141ca420e1630c7b54039959ed"} Mar 21 04:58:03 crc kubenswrapper[4839]: I0321 04:58:03.516904 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-qzz8l" Mar 21 04:58:03 crc kubenswrapper[4839]: I0321 04:58:03.599459 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw85t\" (UniqueName: \"kubernetes.io/projected/04b644e0-9d17-491d-be8c-359dd9f82604-kube-api-access-rw85t\") pod \"04b644e0-9d17-491d-be8c-359dd9f82604\" (UID: \"04b644e0-9d17-491d-be8c-359dd9f82604\") " Mar 21 04:58:03 crc kubenswrapper[4839]: I0321 04:58:03.605522 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b644e0-9d17-491d-be8c-359dd9f82604-kube-api-access-rw85t" (OuterVolumeSpecName: "kube-api-access-rw85t") pod "04b644e0-9d17-491d-be8c-359dd9f82604" (UID: "04b644e0-9d17-491d-be8c-359dd9f82604"). InnerVolumeSpecName "kube-api-access-rw85t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:58:03 crc kubenswrapper[4839]: I0321 04:58:03.702083 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw85t\" (UniqueName: \"kubernetes.io/projected/04b644e0-9d17-491d-be8c-359dd9f82604-kube-api-access-rw85t\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:04 crc kubenswrapper[4839]: I0321 04:58:04.174488 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567818-qzz8l" event={"ID":"04b644e0-9d17-491d-be8c-359dd9f82604","Type":"ContainerDied","Data":"e3a17cb1049773d20eadab08a7388a8d7ad767aaa4fd1b4d999e0094525c5ba2"} Mar 21 04:58:04 crc kubenswrapper[4839]: I0321 04:58:04.174781 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3a17cb1049773d20eadab08a7388a8d7ad767aaa4fd1b4d999e0094525c5ba2" Mar 21 04:58:04 crc kubenswrapper[4839]: I0321 04:58:04.174528 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567818-qzz8l" Mar 21 04:58:04 crc kubenswrapper[4839]: I0321 04:58:04.587226 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567812-jglhv"] Mar 21 04:58:04 crc kubenswrapper[4839]: I0321 04:58:04.595621 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567812-jglhv"] Mar 21 04:58:06 crc kubenswrapper[4839]: I0321 04:58:06.484237 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13124dd-cca5-49f6-9638-2cb42ed2bb34" path="/var/lib/kubelet/pods/d13124dd-cca5-49f6-9638-2cb42ed2bb34/volumes" Mar 21 04:58:40 crc kubenswrapper[4839]: I0321 04:58:40.514022 4839 generic.go:334] "Generic (PLEG): container finished" podID="7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" containerID="eaaf1096c895c8ef87987d2b9b75baeb732a85617c19cf6724ad330b3a1d7d4a" exitCode=0 Mar 21 04:58:40 crc kubenswrapper[4839]: I0321 04:58:40.514131 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" event={"ID":"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd","Type":"ContainerDied","Data":"eaaf1096c895c8ef87987d2b9b75baeb732a85617c19cf6724ad330b3a1d7d4a"} Mar 21 04:58:41 crc kubenswrapper[4839]: I0321 04:58:41.944705 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.097192 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovncontroller-config-0\") pod \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.097615 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ssh-key-openstack-edpm-ipam\") pod \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.097656 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp4jm\" (UniqueName: \"kubernetes.io/projected/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-kube-api-access-tp4jm\") pod \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.097790 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovn-combined-ca-bundle\") pod \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 
04:58:42.097849 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-inventory\") pod \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\" (UID: \"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd\") " Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.102725 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" (UID: "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.106805 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-kube-api-access-tp4jm" (OuterVolumeSpecName: "kube-api-access-tp4jm") pod "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" (UID: "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd"). InnerVolumeSpecName "kube-api-access-tp4jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.122775 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" (UID: "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.123611 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" (UID: "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.125795 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-inventory" (OuterVolumeSpecName: "inventory") pod "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" (UID: "7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.199735 4839 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.199775 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.199786 4839 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.199795 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.199803 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp4jm\" (UniqueName: \"kubernetes.io/projected/7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd-kube-api-access-tp4jm\") on node \"crc\" DevicePath \"\"" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.546218 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" event={"ID":"7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd","Type":"ContainerDied","Data":"90bc916bb321224b514616b34e453eee1cee1631314ee728c0c789b978bf6856"} Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.546261 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90bc916bb321224b514616b34e453eee1cee1631314ee728c0c789b978bf6856" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.546318 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v4wqq" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.766624 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d"] Mar 21 04:58:42 crc kubenswrapper[4839]: E0321 04:58:42.767446 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.767545 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 21 04:58:42 crc kubenswrapper[4839]: E0321 04:58:42.767658 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b644e0-9d17-491d-be8c-359dd9f82604" containerName="oc" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.767732 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b644e0-9d17-491d-be8c-359dd9f82604" containerName="oc" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.768050 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.768168 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b644e0-9d17-491d-be8c-359dd9f82604" containerName="oc" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.769042 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.773079 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.773262 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.773317 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.773641 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.773889 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.773966 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.787378 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d"] Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.920109 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.920168 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.920191 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.920220 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.920282 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:42 crc kubenswrapper[4839]: I0321 04:58:42.920315 4839 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfz7c\" (UniqueName: \"kubernetes.io/projected/ceef8f42-5d77-44c1-ac39-edf0080f68e0-kube-api-access-zfz7c\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.021332 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.022081 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.022173 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.022343 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.022467 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.022558 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfz7c\" (UniqueName: \"kubernetes.io/projected/ceef8f42-5d77-44c1-ac39-edf0080f68e0-kube-api-access-zfz7c\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.026432 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.026513 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.026909 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.027953 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.029517 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.040655 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfz7c\" (UniqueName: \"kubernetes.io/projected/ceef8f42-5d77-44c1-ac39-edf0080f68e0-kube-api-access-zfz7c\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d\" (UID: 
\"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.089511 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.491304 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d"] Mar 21 04:58:43 crc kubenswrapper[4839]: I0321 04:58:43.555295 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" event={"ID":"ceef8f42-5d77-44c1-ac39-edf0080f68e0","Type":"ContainerStarted","Data":"4ee715bddff07c76544b702e5eaee41ba3d3c365ea6d9c4b6f3ab84190c734f8"} Mar 21 04:58:44 crc kubenswrapper[4839]: I0321 04:58:44.566684 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" event={"ID":"ceef8f42-5d77-44c1-ac39-edf0080f68e0","Type":"ContainerStarted","Data":"6bbd6cad4a6706fb2dbeb55ea678fa13e80d6192c348063b3888476d7157223a"} Mar 21 04:58:44 crc kubenswrapper[4839]: I0321 04:58:44.588638 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" podStartSLOduration=1.932611635 podStartE2EDuration="2.588621228s" podCreationTimestamp="2026-03-21 04:58:42 +0000 UTC" firstStartedPulling="2026-03-21 04:58:43.497915117 +0000 UTC m=+2127.825701813" lastFinishedPulling="2026-03-21 04:58:44.15392473 +0000 UTC m=+2128.481711406" observedRunningTime="2026-03-21 04:58:44.588072023 +0000 UTC m=+2128.915858709" watchObservedRunningTime="2026-03-21 04:58:44.588621228 +0000 UTC m=+2128.916407904" Mar 21 04:58:55 crc kubenswrapper[4839]: I0321 04:58:55.507046 4839 scope.go:117] "RemoveContainer" 
containerID="5189f213ccdcf6760a09eb930ee4482a2d44b649489c1422b2b1e4b3849ef663" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.142034 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gsh99"] Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.145723 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.157302 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gsh99"] Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.181073 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-catalog-content\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.181162 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hrrk\" (UniqueName: \"kubernetes.io/projected/21140940-0075-4e70-915c-e37382cc0dd8-kube-api-access-4hrrk\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.181252 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-utilities\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.283211 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4hrrk\" (UniqueName: \"kubernetes.io/projected/21140940-0075-4e70-915c-e37382cc0dd8-kube-api-access-4hrrk\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.283316 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-utilities\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.283412 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-catalog-content\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.283872 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-catalog-content\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.283971 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-utilities\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.308886 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hrrk\" (UniqueName: 
\"kubernetes.io/projected/21140940-0075-4e70-915c-e37382cc0dd8-kube-api-access-4hrrk\") pod \"redhat-operators-gsh99\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.482332 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:11 crc kubenswrapper[4839]: I0321 04:59:11.957632 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gsh99"] Mar 21 04:59:12 crc kubenswrapper[4839]: I0321 04:59:12.832696 4839 generic.go:334] "Generic (PLEG): container finished" podID="21140940-0075-4e70-915c-e37382cc0dd8" containerID="55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867" exitCode=0 Mar 21 04:59:12 crc kubenswrapper[4839]: I0321 04:59:12.833497 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsh99" event={"ID":"21140940-0075-4e70-915c-e37382cc0dd8","Type":"ContainerDied","Data":"55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867"} Mar 21 04:59:12 crc kubenswrapper[4839]: I0321 04:59:12.833609 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsh99" event={"ID":"21140940-0075-4e70-915c-e37382cc0dd8","Type":"ContainerStarted","Data":"208cf34686ee589dee6133b46731ab90df50d08a69e91f3ab54667a8147ae4d9"} Mar 21 04:59:13 crc kubenswrapper[4839]: I0321 04:59:13.843531 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsh99" event={"ID":"21140940-0075-4e70-915c-e37382cc0dd8","Type":"ContainerStarted","Data":"cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9"} Mar 21 04:59:15 crc kubenswrapper[4839]: I0321 04:59:15.864353 4839 generic.go:334] "Generic (PLEG): container finished" podID="21140940-0075-4e70-915c-e37382cc0dd8" 
containerID="cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9" exitCode=0 Mar 21 04:59:15 crc kubenswrapper[4839]: I0321 04:59:15.864449 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsh99" event={"ID":"21140940-0075-4e70-915c-e37382cc0dd8","Type":"ContainerDied","Data":"cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9"} Mar 21 04:59:18 crc kubenswrapper[4839]: I0321 04:59:18.894811 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsh99" event={"ID":"21140940-0075-4e70-915c-e37382cc0dd8","Type":"ContainerStarted","Data":"6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355"} Mar 21 04:59:19 crc kubenswrapper[4839]: I0321 04:59:19.933786 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gsh99" podStartSLOduration=4.518973304 podStartE2EDuration="8.933766506s" podCreationTimestamp="2026-03-21 04:59:11 +0000 UTC" firstStartedPulling="2026-03-21 04:59:12.83497879 +0000 UTC m=+2157.162765466" lastFinishedPulling="2026-03-21 04:59:17.249771982 +0000 UTC m=+2161.577558668" observedRunningTime="2026-03-21 04:59:19.925546194 +0000 UTC m=+2164.253332880" watchObservedRunningTime="2026-03-21 04:59:19.933766506 +0000 UTC m=+2164.261553202" Mar 21 04:59:21 crc kubenswrapper[4839]: I0321 04:59:21.483341 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:21 crc kubenswrapper[4839]: I0321 04:59:21.485124 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:22 crc kubenswrapper[4839]: I0321 04:59:22.530833 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gsh99" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="registry-server" 
probeResult="failure" output=< Mar 21 04:59:22 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 04:59:22 crc kubenswrapper[4839]: > Mar 21 04:59:29 crc kubenswrapper[4839]: I0321 04:59:29.995024 4839 generic.go:334] "Generic (PLEG): container finished" podID="ceef8f42-5d77-44c1-ac39-edf0080f68e0" containerID="6bbd6cad4a6706fb2dbeb55ea678fa13e80d6192c348063b3888476d7157223a" exitCode=0 Mar 21 04:59:29 crc kubenswrapper[4839]: I0321 04:59:29.995090 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" event={"ID":"ceef8f42-5d77-44c1-ac39-edf0080f68e0","Type":"ContainerDied","Data":"6bbd6cad4a6706fb2dbeb55ea678fa13e80d6192c348063b3888476d7157223a"} Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.401654 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.520644 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfz7c\" (UniqueName: \"kubernetes.io/projected/ceef8f42-5d77-44c1-ac39-edf0080f68e0-kube-api-access-zfz7c\") pod \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.521008 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-nova-metadata-neutron-config-0\") pod \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.521117 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-ssh-key-openstack-edpm-ipam\") pod \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.522120 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-inventory\") pod \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.522281 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.522349 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-metadata-combined-ca-bundle\") pod \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\" (UID: \"ceef8f42-5d77-44c1-ac39-edf0080f68e0\") " Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.526826 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceef8f42-5d77-44c1-ac39-edf0080f68e0-kube-api-access-zfz7c" (OuterVolumeSpecName: "kube-api-access-zfz7c") pod "ceef8f42-5d77-44c1-ac39-edf0080f68e0" (UID: "ceef8f42-5d77-44c1-ac39-edf0080f68e0"). InnerVolumeSpecName "kube-api-access-zfz7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.533716 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ceef8f42-5d77-44c1-ac39-edf0080f68e0" (UID: "ceef8f42-5d77-44c1-ac39-edf0080f68e0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.544455 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.551497 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-inventory" (OuterVolumeSpecName: "inventory") pod "ceef8f42-5d77-44c1-ac39-edf0080f68e0" (UID: "ceef8f42-5d77-44c1-ac39-edf0080f68e0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.551543 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ceef8f42-5d77-44c1-ac39-edf0080f68e0" (UID: "ceef8f42-5d77-44c1-ac39-edf0080f68e0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.554517 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ceef8f42-5d77-44c1-ac39-edf0080f68e0" (UID: "ceef8f42-5d77-44c1-ac39-edf0080f68e0"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.581610 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ceef8f42-5d77-44c1-ac39-edf0080f68e0" (UID: "ceef8f42-5d77-44c1-ac39-edf0080f68e0"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.604360 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.629869 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.629919 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.629936 4839 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.630011 4839 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.630049 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfz7c\" (UniqueName: \"kubernetes.io/projected/ceef8f42-5d77-44c1-ac39-edf0080f68e0-kube-api-access-zfz7c\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.630065 4839 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ceef8f42-5d77-44c1-ac39-edf0080f68e0-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 
21 04:59:31 crc kubenswrapper[4839]: I0321 04:59:31.780937 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gsh99"] Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.024989 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.024988 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d" event={"ID":"ceef8f42-5d77-44c1-ac39-edf0080f68e0","Type":"ContainerDied","Data":"4ee715bddff07c76544b702e5eaee41ba3d3c365ea6d9c4b6f3ab84190c734f8"} Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.025096 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ee715bddff07c76544b702e5eaee41ba3d3c365ea6d9c4b6f3ab84190c734f8" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.155723 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6"] Mar 21 04:59:32 crc kubenswrapper[4839]: E0321 04:59:32.156328 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceef8f42-5d77-44c1-ac39-edf0080f68e0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.156364 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceef8f42-5d77-44c1-ac39-edf0080f68e0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.156788 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceef8f42-5d77-44c1-ac39-edf0080f68e0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.158094 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.162540 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.163625 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.164307 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.164481 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.164701 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.177564 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6"] Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.345416 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.345626 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: 
\"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.345687 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.345764 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.345823 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf8mj\" (UniqueName: \"kubernetes.io/projected/2d056acb-0183-4157-a830-fff4cd1dcacf-kube-api-access-pf8mj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.447651 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.447813 
4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.447862 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.447921 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.447974 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf8mj\" (UniqueName: \"kubernetes.io/projected/2d056acb-0183-4157-a830-fff4cd1dcacf-kube-api-access-pf8mj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.453727 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: 
\"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.454619 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.455058 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.456786 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.475451 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf8mj\" (UniqueName: \"kubernetes.io/projected/2d056acb-0183-4157-a830-fff4cd1dcacf-kube-api-access-pf8mj\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-w48j6\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.479432 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 04:59:32 crc kubenswrapper[4839]: I0321 04:59:32.984288 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6"] Mar 21 04:59:32 crc kubenswrapper[4839]: W0321 04:59:32.984436 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d056acb_0183_4157_a830_fff4cd1dcacf.slice/crio-a47bbc08aef3ecca5f0392206f51820ac7164fd826b5332b0545a1c8dd6795ca WatchSource:0}: Error finding container a47bbc08aef3ecca5f0392206f51820ac7164fd826b5332b0545a1c8dd6795ca: Status 404 returned error can't find the container with id a47bbc08aef3ecca5f0392206f51820ac7164fd826b5332b0545a1c8dd6795ca Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.033680 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" event={"ID":"2d056acb-0183-4157-a830-fff4cd1dcacf","Type":"ContainerStarted","Data":"a47bbc08aef3ecca5f0392206f51820ac7164fd826b5332b0545a1c8dd6795ca"} Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.033828 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gsh99" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="registry-server" containerID="cri-o://6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355" gracePeriod=2 Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.588666 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.772494 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-utilities\") pod \"21140940-0075-4e70-915c-e37382cc0dd8\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.772785 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hrrk\" (UniqueName: \"kubernetes.io/projected/21140940-0075-4e70-915c-e37382cc0dd8-kube-api-access-4hrrk\") pod \"21140940-0075-4e70-915c-e37382cc0dd8\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.772831 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-catalog-content\") pod \"21140940-0075-4e70-915c-e37382cc0dd8\" (UID: \"21140940-0075-4e70-915c-e37382cc0dd8\") " Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.774011 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-utilities" (OuterVolumeSpecName: "utilities") pod "21140940-0075-4e70-915c-e37382cc0dd8" (UID: "21140940-0075-4e70-915c-e37382cc0dd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.778702 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21140940-0075-4e70-915c-e37382cc0dd8-kube-api-access-4hrrk" (OuterVolumeSpecName: "kube-api-access-4hrrk") pod "21140940-0075-4e70-915c-e37382cc0dd8" (UID: "21140940-0075-4e70-915c-e37382cc0dd8"). InnerVolumeSpecName "kube-api-access-4hrrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.875039 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.875306 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hrrk\" (UniqueName: \"kubernetes.io/projected/21140940-0075-4e70-915c-e37382cc0dd8-kube-api-access-4hrrk\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.935844 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21140940-0075-4e70-915c-e37382cc0dd8" (UID: "21140940-0075-4e70-915c-e37382cc0dd8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 04:59:33 crc kubenswrapper[4839]: I0321 04:59:33.978144 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21140940-0075-4e70-915c-e37382cc0dd8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.043102 4839 generic.go:334] "Generic (PLEG): container finished" podID="21140940-0075-4e70-915c-e37382cc0dd8" containerID="6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355" exitCode=0 Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.043162 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gsh99" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.043167 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsh99" event={"ID":"21140940-0075-4e70-915c-e37382cc0dd8","Type":"ContainerDied","Data":"6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355"} Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.043306 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gsh99" event={"ID":"21140940-0075-4e70-915c-e37382cc0dd8","Type":"ContainerDied","Data":"208cf34686ee589dee6133b46731ab90df50d08a69e91f3ab54667a8147ae4d9"} Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.043342 4839 scope.go:117] "RemoveContainer" containerID="6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.046474 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" event={"ID":"2d056acb-0183-4157-a830-fff4cd1dcacf","Type":"ContainerStarted","Data":"38923cbb1565ae7e426fbfa5a7cacab2c7dc20c694af3c1982bdf3aeaab3650d"} Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.066882 4839 scope.go:117] "RemoveContainer" containerID="cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.072396 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" podStartSLOduration=1.273464228 podStartE2EDuration="2.072378733s" podCreationTimestamp="2026-03-21 04:59:32 +0000 UTC" firstStartedPulling="2026-03-21 04:59:32.98728634 +0000 UTC m=+2177.315073016" lastFinishedPulling="2026-03-21 04:59:33.786200805 +0000 UTC m=+2178.113987521" observedRunningTime="2026-03-21 04:59:34.067080714 +0000 UTC m=+2178.394867400" 
watchObservedRunningTime="2026-03-21 04:59:34.072378733 +0000 UTC m=+2178.400165409" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.095191 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gsh99"] Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.101340 4839 scope.go:117] "RemoveContainer" containerID="55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.104594 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gsh99"] Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.125492 4839 scope.go:117] "RemoveContainer" containerID="6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355" Mar 21 04:59:34 crc kubenswrapper[4839]: E0321 04:59:34.125995 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355\": container with ID starting with 6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355 not found: ID does not exist" containerID="6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.126028 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355"} err="failed to get container status \"6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355\": rpc error: code = NotFound desc = could not find container \"6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355\": container with ID starting with 6d9b0f369a1fc13f21c4a26a53ef07fc31884298133c4ccdcab5004c17fdd355 not found: ID does not exist" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.126048 4839 scope.go:117] "RemoveContainer" 
containerID="cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9" Mar 21 04:59:34 crc kubenswrapper[4839]: E0321 04:59:34.126352 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9\": container with ID starting with cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9 not found: ID does not exist" containerID="cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.126393 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9"} err="failed to get container status \"cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9\": rpc error: code = NotFound desc = could not find container \"cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9\": container with ID starting with cc7df358d00b09b247c1efcf8886b075a20eb00bfbb4f3f671e8a5f170118ec9 not found: ID does not exist" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.126422 4839 scope.go:117] "RemoveContainer" containerID="55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867" Mar 21 04:59:34 crc kubenswrapper[4839]: E0321 04:59:34.126745 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867\": container with ID starting with 55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867 not found: ID does not exist" containerID="55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.126789 4839 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867"} err="failed to get container status \"55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867\": rpc error: code = NotFound desc = could not find container \"55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867\": container with ID starting with 55aece69ea4b938a83677f50b6562a8cb8b2f71c1eedd472b9b4b617c5f69867 not found: ID does not exist" Mar 21 04:59:34 crc kubenswrapper[4839]: E0321 04:59:34.188963 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21140940_0075_4e70_915c_e37382cc0dd8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21140940_0075_4e70_915c_e37382cc0dd8.slice/crio-208cf34686ee589dee6133b46731ab90df50d08a69e91f3ab54667a8147ae4d9\": RecentStats: unable to find data in memory cache]" Mar 21 04:59:34 crc kubenswrapper[4839]: I0321 04:59:34.464445 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21140940-0075-4e70-915c-e37382cc0dd8" path="/var/lib/kubelet/pods/21140940-0075-4e70-915c-e37382cc0dd8/volumes" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.181851 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567820-drjbh"] Mar 21 05:00:00 crc kubenswrapper[4839]: E0321 05:00:00.183261 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="extract-content" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.183290 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="extract-content" Mar 21 05:00:00 crc kubenswrapper[4839]: E0321 05:00:00.183324 4839 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="extract-utilities" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.183341 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="extract-utilities" Mar 21 05:00:00 crc kubenswrapper[4839]: E0321 05:00:00.183391 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="registry-server" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.183408 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="registry-server" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.183875 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="21140940-0075-4e70-915c-e37382cc0dd8" containerName="registry-server" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.185170 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-drjbh" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.190477 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.191049 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.191378 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.204768 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf"] Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.206394 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.208478 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-drjbh"] Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.209579 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.213165 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.219758 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf"] Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.310242 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-config-volume\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.310348 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lr8z\" (UniqueName: \"kubernetes.io/projected/2a082320-155d-4eb3-9779-9c6bb4db2b77-kube-api-access-5lr8z\") pod \"auto-csr-approver-29567820-drjbh\" (UID: \"2a082320-155d-4eb3-9779-9c6bb4db2b77\") " pod="openshift-infra/auto-csr-approver-29567820-drjbh" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.311885 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-secret-volume\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.311968 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkwlg\" (UniqueName: \"kubernetes.io/projected/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-kube-api-access-fkwlg\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.415754 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkwlg\" (UniqueName: \"kubernetes.io/projected/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-kube-api-access-fkwlg\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.416030 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-config-volume\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.416296 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lr8z\" (UniqueName: \"kubernetes.io/projected/2a082320-155d-4eb3-9779-9c6bb4db2b77-kube-api-access-5lr8z\") pod \"auto-csr-approver-29567820-drjbh\" (UID: \"2a082320-155d-4eb3-9779-9c6bb4db2b77\") " pod="openshift-infra/auto-csr-approver-29567820-drjbh" Mar 21 05:00:00 crc 
kubenswrapper[4839]: I0321 05:00:00.416748 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-secret-volume\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.418331 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-config-volume\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.428472 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-secret-volume\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.431167 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkwlg\" (UniqueName: \"kubernetes.io/projected/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-kube-api-access-fkwlg\") pod \"collect-profiles-29567820-vhdmf\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.436541 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lr8z\" (UniqueName: \"kubernetes.io/projected/2a082320-155d-4eb3-9779-9c6bb4db2b77-kube-api-access-5lr8z\") pod \"auto-csr-approver-29567820-drjbh\" (UID: \"2a082320-155d-4eb3-9779-9c6bb4db2b77\") " 
pod="openshift-infra/auto-csr-approver-29567820-drjbh" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.530159 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-drjbh" Mar 21 05:00:00 crc kubenswrapper[4839]: I0321 05:00:00.547421 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:01 crc kubenswrapper[4839]: I0321 05:00:01.013511 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-drjbh"] Mar 21 05:00:01 crc kubenswrapper[4839]: W0321 05:00:01.015112 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a082320_155d_4eb3_9779_9c6bb4db2b77.slice/crio-f351d4aa8d34f3a05caa31e580fbae00b07a058f794284e383a28c466298b37b WatchSource:0}: Error finding container f351d4aa8d34f3a05caa31e580fbae00b07a058f794284e383a28c466298b37b: Status 404 returned error can't find the container with id f351d4aa8d34f3a05caa31e580fbae00b07a058f794284e383a28c466298b37b Mar 21 05:00:01 crc kubenswrapper[4839]: I0321 05:00:01.080347 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf"] Mar 21 05:00:01 crc kubenswrapper[4839]: W0321 05:00:01.080356 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2ddd6fb_1042_49a9_a76c_d00f5710a0fd.slice/crio-61268ffe5b7b4cc3afd1ac0ee773199d0f1f59992991e1326fb41d82c6e03302 WatchSource:0}: Error finding container 61268ffe5b7b4cc3afd1ac0ee773199d0f1f59992991e1326fb41d82c6e03302: Status 404 returned error can't find the container with id 61268ffe5b7b4cc3afd1ac0ee773199d0f1f59992991e1326fb41d82c6e03302 Mar 21 05:00:01 crc kubenswrapper[4839]: I0321 05:00:01.316267 4839 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" event={"ID":"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd","Type":"ContainerStarted","Data":"002d83959bbc4835db1d6adcbc064b45d0e38c924ab375cc3c27ef8985bcdd9a"} Mar 21 05:00:01 crc kubenswrapper[4839]: I0321 05:00:01.316580 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" event={"ID":"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd","Type":"ContainerStarted","Data":"61268ffe5b7b4cc3afd1ac0ee773199d0f1f59992991e1326fb41d82c6e03302"} Mar 21 05:00:01 crc kubenswrapper[4839]: I0321 05:00:01.318443 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567820-drjbh" event={"ID":"2a082320-155d-4eb3-9779-9c6bb4db2b77","Type":"ContainerStarted","Data":"f351d4aa8d34f3a05caa31e580fbae00b07a058f794284e383a28c466298b37b"} Mar 21 05:00:01 crc kubenswrapper[4839]: I0321 05:00:01.335728 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" podStartSLOduration=1.335697633 podStartE2EDuration="1.335697633s" podCreationTimestamp="2026-03-21 05:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:00:01.329736985 +0000 UTC m=+2205.657523691" watchObservedRunningTime="2026-03-21 05:00:01.335697633 +0000 UTC m=+2205.663484359" Mar 21 05:00:02 crc kubenswrapper[4839]: I0321 05:00:02.329041 4839 generic.go:334] "Generic (PLEG): container finished" podID="e2ddd6fb-1042-49a9-a76c-d00f5710a0fd" containerID="002d83959bbc4835db1d6adcbc064b45d0e38c924ab375cc3c27ef8985bcdd9a" exitCode=0 Mar 21 05:00:02 crc kubenswrapper[4839]: I0321 05:00:02.329120 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" 
event={"ID":"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd","Type":"ContainerDied","Data":"002d83959bbc4835db1d6adcbc064b45d0e38c924ab375cc3c27ef8985bcdd9a"} Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.354228 4839 generic.go:334] "Generic (PLEG): container finished" podID="2a082320-155d-4eb3-9779-9c6bb4db2b77" containerID="282bb60b5cd122e380e9afc3be1cd2592f307d95be7c684d73af1ea1a65bb700" exitCode=0 Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.354292 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567820-drjbh" event={"ID":"2a082320-155d-4eb3-9779-9c6bb4db2b77","Type":"ContainerDied","Data":"282bb60b5cd122e380e9afc3be1cd2592f307d95be7c684d73af1ea1a65bb700"} Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.722022 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.910845 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-secret-volume\") pod \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.910991 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-config-volume\") pod \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.911963 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-config-volume" (OuterVolumeSpecName: "config-volume") pod "e2ddd6fb-1042-49a9-a76c-d00f5710a0fd" (UID: "e2ddd6fb-1042-49a9-a76c-d00f5710a0fd"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.912430 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkwlg\" (UniqueName: \"kubernetes.io/projected/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-kube-api-access-fkwlg\") pod \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\" (UID: \"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd\") " Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.913369 4839 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.916798 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e2ddd6fb-1042-49a9-a76c-d00f5710a0fd" (UID: "e2ddd6fb-1042-49a9-a76c-d00f5710a0fd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:00:03 crc kubenswrapper[4839]: I0321 05:00:03.918553 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-kube-api-access-fkwlg" (OuterVolumeSpecName: "kube-api-access-fkwlg") pod "e2ddd6fb-1042-49a9-a76c-d00f5710a0fd" (UID: "e2ddd6fb-1042-49a9-a76c-d00f5710a0fd"). InnerVolumeSpecName "kube-api-access-fkwlg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.015313 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkwlg\" (UniqueName: \"kubernetes.io/projected/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-kube-api-access-fkwlg\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.015358 4839 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2ddd6fb-1042-49a9-a76c-d00f5710a0fd-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.370193 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.370210 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567820-vhdmf" event={"ID":"e2ddd6fb-1042-49a9-a76c-d00f5710a0fd","Type":"ContainerDied","Data":"61268ffe5b7b4cc3afd1ac0ee773199d0f1f59992991e1326fb41d82c6e03302"} Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.370275 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61268ffe5b7b4cc3afd1ac0ee773199d0f1f59992991e1326fb41d82c6e03302" Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.414159 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48"] Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.422167 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567775-lfv48"] Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.477818 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0368223e-2e01-4681-a7a6-67b77387f8d8" 
path="/var/lib/kubelet/pods/0368223e-2e01-4681-a7a6-67b77387f8d8/volumes" Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.699965 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-drjbh" Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.828185 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lr8z\" (UniqueName: \"kubernetes.io/projected/2a082320-155d-4eb3-9779-9c6bb4db2b77-kube-api-access-5lr8z\") pod \"2a082320-155d-4eb3-9779-9c6bb4db2b77\" (UID: \"2a082320-155d-4eb3-9779-9c6bb4db2b77\") " Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.834467 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a082320-155d-4eb3-9779-9c6bb4db2b77-kube-api-access-5lr8z" (OuterVolumeSpecName: "kube-api-access-5lr8z") pod "2a082320-155d-4eb3-9779-9c6bb4db2b77" (UID: "2a082320-155d-4eb3-9779-9c6bb4db2b77"). InnerVolumeSpecName "kube-api-access-5lr8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:00:04 crc kubenswrapper[4839]: I0321 05:00:04.930467 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lr8z\" (UniqueName: \"kubernetes.io/projected/2a082320-155d-4eb3-9779-9c6bb4db2b77-kube-api-access-5lr8z\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:05 crc kubenswrapper[4839]: I0321 05:00:05.379252 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567820-drjbh" event={"ID":"2a082320-155d-4eb3-9779-9c6bb4db2b77","Type":"ContainerDied","Data":"f351d4aa8d34f3a05caa31e580fbae00b07a058f794284e383a28c466298b37b"} Mar 21 05:00:05 crc kubenswrapper[4839]: I0321 05:00:05.379287 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f351d4aa8d34f3a05caa31e580fbae00b07a058f794284e383a28c466298b37b" Mar 21 05:00:05 crc kubenswrapper[4839]: I0321 05:00:05.379351 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567820-drjbh" Mar 21 05:00:05 crc kubenswrapper[4839]: I0321 05:00:05.756408 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-q8zxw"] Mar 21 05:00:05 crc kubenswrapper[4839]: I0321 05:00:05.785389 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567814-q8zxw"] Mar 21 05:00:06 crc kubenswrapper[4839]: I0321 05:00:06.465932 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="852785cf-c79d-4c8e-92f0-f15d9836b437" path="/var/lib/kubelet/pods/852785cf-c79d-4c8e-92f0-f15d9836b437/volumes" Mar 21 05:00:30 crc kubenswrapper[4839]: I0321 05:00:30.980061 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 21 05:00:30 crc kubenswrapper[4839]: I0321 05:00:30.981743 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.720838 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tbnr8"] Mar 21 05:00:34 crc kubenswrapper[4839]: E0321 05:00:34.721656 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a082320-155d-4eb3-9779-9c6bb4db2b77" containerName="oc" Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.721670 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a082320-155d-4eb3-9779-9c6bb4db2b77" containerName="oc" Mar 21 05:00:34 crc kubenswrapper[4839]: E0321 05:00:34.721691 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2ddd6fb-1042-49a9-a76c-d00f5710a0fd" containerName="collect-profiles" Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.721698 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2ddd6fb-1042-49a9-a76c-d00f5710a0fd" containerName="collect-profiles" Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.721896 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2ddd6fb-1042-49a9-a76c-d00f5710a0fd" containerName="collect-profiles" Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.721920 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a082320-155d-4eb3-9779-9c6bb4db2b77" containerName="oc" Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.724875 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.759641 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbnr8"] Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.854825 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-catalog-content\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.854927 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hpzw\" (UniqueName: \"kubernetes.io/projected/c7515499-6e18-46fa-b97d-583a44f6066d-kube-api-access-2hpzw\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.854956 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-utilities\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.956245 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hpzw\" (UniqueName: \"kubernetes.io/projected/c7515499-6e18-46fa-b97d-583a44f6066d-kube-api-access-2hpzw\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.956310 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-utilities\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.956517 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-catalog-content\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.957016 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-utilities\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.957036 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-catalog-content\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:34 crc kubenswrapper[4839]: I0321 05:00:34.976065 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hpzw\" (UniqueName: \"kubernetes.io/projected/c7515499-6e18-46fa-b97d-583a44f6066d-kube-api-access-2hpzw\") pod \"community-operators-tbnr8\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:35 crc kubenswrapper[4839]: I0321 05:00:35.043307 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:35 crc kubenswrapper[4839]: I0321 05:00:35.410770 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbnr8"] Mar 21 05:00:35 crc kubenswrapper[4839]: W0321 05:00:35.425936 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7515499_6e18_46fa_b97d_583a44f6066d.slice/crio-5f670a4d49fe6161a1487413471ff8ef4ad0dc5a4980ee849cc16304389b019d WatchSource:0}: Error finding container 5f670a4d49fe6161a1487413471ff8ef4ad0dc5a4980ee849cc16304389b019d: Status 404 returned error can't find the container with id 5f670a4d49fe6161a1487413471ff8ef4ad0dc5a4980ee849cc16304389b019d Mar 21 05:00:35 crc kubenswrapper[4839]: I0321 05:00:35.711097 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbnr8" event={"ID":"c7515499-6e18-46fa-b97d-583a44f6066d","Type":"ContainerStarted","Data":"5f670a4d49fe6161a1487413471ff8ef4ad0dc5a4980ee849cc16304389b019d"} Mar 21 05:00:36 crc kubenswrapper[4839]: I0321 05:00:36.742808 4839 generic.go:334] "Generic (PLEG): container finished" podID="c7515499-6e18-46fa-b97d-583a44f6066d" containerID="8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261" exitCode=0 Mar 21 05:00:36 crc kubenswrapper[4839]: I0321 05:00:36.742924 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbnr8" event={"ID":"c7515499-6e18-46fa-b97d-583a44f6066d","Type":"ContainerDied","Data":"8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261"} Mar 21 05:00:36 crc kubenswrapper[4839]: I0321 05:00:36.746103 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:00:38 crc kubenswrapper[4839]: I0321 05:00:38.766812 4839 generic.go:334] "Generic (PLEG): container finished" 
podID="c7515499-6e18-46fa-b97d-583a44f6066d" containerID="9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958" exitCode=0 Mar 21 05:00:38 crc kubenswrapper[4839]: I0321 05:00:38.767165 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbnr8" event={"ID":"c7515499-6e18-46fa-b97d-583a44f6066d","Type":"ContainerDied","Data":"9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958"} Mar 21 05:00:40 crc kubenswrapper[4839]: I0321 05:00:40.789765 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbnr8" event={"ID":"c7515499-6e18-46fa-b97d-583a44f6066d","Type":"ContainerStarted","Data":"0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8"} Mar 21 05:00:45 crc kubenswrapper[4839]: I0321 05:00:45.044470 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:45 crc kubenswrapper[4839]: I0321 05:00:45.045235 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:45 crc kubenswrapper[4839]: I0321 05:00:45.117196 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:45 crc kubenswrapper[4839]: I0321 05:00:45.142759 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tbnr8" podStartSLOduration=8.150740645 podStartE2EDuration="11.142744052s" podCreationTimestamp="2026-03-21 05:00:34 +0000 UTC" firstStartedPulling="2026-03-21 05:00:36.745823917 +0000 UTC m=+2241.073610593" lastFinishedPulling="2026-03-21 05:00:39.737827324 +0000 UTC m=+2244.065614000" observedRunningTime="2026-03-21 05:00:40.820750007 +0000 UTC m=+2245.148536713" watchObservedRunningTime="2026-03-21 05:00:45.142744052 +0000 UTC m=+2249.470530728" Mar 21 
05:00:45 crc kubenswrapper[4839]: I0321 05:00:45.891550 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:45 crc kubenswrapper[4839]: I0321 05:00:45.951194 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tbnr8"] Mar 21 05:00:47 crc kubenswrapper[4839]: I0321 05:00:47.860519 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tbnr8" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" containerName="registry-server" containerID="cri-o://0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8" gracePeriod=2 Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.346310 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.482328 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hpzw\" (UniqueName: \"kubernetes.io/projected/c7515499-6e18-46fa-b97d-583a44f6066d-kube-api-access-2hpzw\") pod \"c7515499-6e18-46fa-b97d-583a44f6066d\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.483298 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-catalog-content\") pod \"c7515499-6e18-46fa-b97d-583a44f6066d\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.483338 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-utilities\") pod \"c7515499-6e18-46fa-b97d-583a44f6066d\" (UID: \"c7515499-6e18-46fa-b97d-583a44f6066d\") " Mar 
21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.485237 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-utilities" (OuterVolumeSpecName: "utilities") pod "c7515499-6e18-46fa-b97d-583a44f6066d" (UID: "c7515499-6e18-46fa-b97d-583a44f6066d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.487975 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.488548 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7515499-6e18-46fa-b97d-583a44f6066d-kube-api-access-2hpzw" (OuterVolumeSpecName: "kube-api-access-2hpzw") pod "c7515499-6e18-46fa-b97d-583a44f6066d" (UID: "c7515499-6e18-46fa-b97d-583a44f6066d"). InnerVolumeSpecName "kube-api-access-2hpzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.557009 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7515499-6e18-46fa-b97d-583a44f6066d" (UID: "c7515499-6e18-46fa-b97d-583a44f6066d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.589284 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hpzw\" (UniqueName: \"kubernetes.io/projected/c7515499-6e18-46fa-b97d-583a44f6066d-kube-api-access-2hpzw\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.589325 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7515499-6e18-46fa-b97d-583a44f6066d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.873632 4839 generic.go:334] "Generic (PLEG): container finished" podID="c7515499-6e18-46fa-b97d-583a44f6066d" containerID="0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8" exitCode=0 Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.873731 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbnr8" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.873721 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbnr8" event={"ID":"c7515499-6e18-46fa-b97d-583a44f6066d","Type":"ContainerDied","Data":"0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8"} Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.874022 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbnr8" event={"ID":"c7515499-6e18-46fa-b97d-583a44f6066d","Type":"ContainerDied","Data":"5f670a4d49fe6161a1487413471ff8ef4ad0dc5a4980ee849cc16304389b019d"} Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.874090 4839 scope.go:117] "RemoveContainer" containerID="0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.895385 4839 scope.go:117] "RemoveContainer" 
containerID="9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.915643 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tbnr8"] Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.934612 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tbnr8"] Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.947815 4839 scope.go:117] "RemoveContainer" containerID="8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.969057 4839 scope.go:117] "RemoveContainer" containerID="0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8" Mar 21 05:00:48 crc kubenswrapper[4839]: E0321 05:00:48.969953 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8\": container with ID starting with 0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8 not found: ID does not exist" containerID="0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.970026 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8"} err="failed to get container status \"0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8\": rpc error: code = NotFound desc = could not find container \"0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8\": container with ID starting with 0cd99d8ae19983c08e303421faad1e7ef1e440e1aed0f32dc1ab671651ede9f8 not found: ID does not exist" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.970053 4839 scope.go:117] "RemoveContainer" 
containerID="9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958" Mar 21 05:00:48 crc kubenswrapper[4839]: E0321 05:00:48.970452 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958\": container with ID starting with 9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958 not found: ID does not exist" containerID="9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.970492 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958"} err="failed to get container status \"9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958\": rpc error: code = NotFound desc = could not find container \"9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958\": container with ID starting with 9491568322b114d0f3fba095f6fad14915b4536f38978a8091edbbcd436ed958 not found: ID does not exist" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.970516 4839 scope.go:117] "RemoveContainer" containerID="8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261" Mar 21 05:00:48 crc kubenswrapper[4839]: E0321 05:00:48.970762 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261\": container with ID starting with 8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261 not found: ID does not exist" containerID="8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261" Mar 21 05:00:48 crc kubenswrapper[4839]: I0321 05:00:48.970783 4839 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261"} err="failed to get container status \"8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261\": rpc error: code = NotFound desc = could not find container \"8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261\": container with ID starting with 8359023cb97d896a3fdf49891f803b8793fefe959ee570e1e9454337ddc48261 not found: ID does not exist" Mar 21 05:00:50 crc kubenswrapper[4839]: I0321 05:00:50.463519 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" path="/var/lib/kubelet/pods/c7515499-6e18-46fa-b97d-583a44f6066d/volumes" Mar 21 05:00:55 crc kubenswrapper[4839]: I0321 05:00:55.610446 4839 scope.go:117] "RemoveContainer" containerID="d122e9d27915a31245552d8140bcb2b6f44ab9e8f5d0f2da420a748e2a0ab38c" Mar 21 05:00:55 crc kubenswrapper[4839]: I0321 05:00:55.660711 4839 scope.go:117] "RemoveContainer" containerID="8dc51ff3af9bc295da39ecd84349288a171e09e13e9355c5592ecc0b1f1951e7" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.154023 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29567821-rmctn"] Mar 21 05:01:00 crc kubenswrapper[4839]: E0321 05:01:00.156351 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" containerName="extract-content" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.156376 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" containerName="extract-content" Mar 21 05:01:00 crc kubenswrapper[4839]: E0321 05:01:00.156408 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" containerName="extract-utilities" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.156416 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" 
containerName="extract-utilities" Mar 21 05:01:00 crc kubenswrapper[4839]: E0321 05:01:00.156435 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" containerName="registry-server" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.156442 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" containerName="registry-server" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.156706 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7515499-6e18-46fa-b97d-583a44f6066d" containerName="registry-server" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.157531 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.163775 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567821-rmctn"] Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.305713 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-fernet-keys\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.305776 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-combined-ca-bundle\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.305831 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj9gr\" (UniqueName: 
\"kubernetes.io/projected/666be2f4-0416-4086-94d3-c48c82f380b2-kube-api-access-hj9gr\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.305882 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-config-data\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.407351 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj9gr\" (UniqueName: \"kubernetes.io/projected/666be2f4-0416-4086-94d3-c48c82f380b2-kube-api-access-hj9gr\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.407430 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-config-data\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.407961 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-fernet-keys\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.407994 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-combined-ca-bundle\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.417773 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-combined-ca-bundle\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.417891 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-fernet-keys\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.418005 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-config-data\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.430071 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj9gr\" (UniqueName: \"kubernetes.io/projected/666be2f4-0416-4086-94d3-c48c82f380b2-kube-api-access-hj9gr\") pod \"keystone-cron-29567821-rmctn\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.474404 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.928522 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567821-rmctn"] Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.980638 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.980703 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:01:00 crc kubenswrapper[4839]: I0321 05:01:00.991799 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567821-rmctn" event={"ID":"666be2f4-0416-4086-94d3-c48c82f380b2","Type":"ContainerStarted","Data":"84a267c39feac16f072a5873d4f95a7263c12a65583247dd89372cebb746437c"} Mar 21 05:01:02 crc kubenswrapper[4839]: I0321 05:01:02.002465 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567821-rmctn" event={"ID":"666be2f4-0416-4086-94d3-c48c82f380b2","Type":"ContainerStarted","Data":"04d4f720b62c166e071450a4c2f749a516bf7a6bafc166466380b4293d53da5c"} Mar 21 05:01:02 crc kubenswrapper[4839]: I0321 05:01:02.028849 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29567821-rmctn" podStartSLOduration=2.028834864 podStartE2EDuration="2.028834864s" podCreationTimestamp="2026-03-21 05:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:01:02.025271404 +0000 UTC m=+2266.353058080" watchObservedRunningTime="2026-03-21 05:01:02.028834864 +0000 UTC m=+2266.356621540" Mar 21 05:01:04 crc kubenswrapper[4839]: I0321 05:01:04.021044 4839 generic.go:334] "Generic (PLEG): container finished" podID="666be2f4-0416-4086-94d3-c48c82f380b2" containerID="04d4f720b62c166e071450a4c2f749a516bf7a6bafc166466380b4293d53da5c" exitCode=0 Mar 21 05:01:04 crc kubenswrapper[4839]: I0321 05:01:04.021127 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567821-rmctn" event={"ID":"666be2f4-0416-4086-94d3-c48c82f380b2","Type":"ContainerDied","Data":"04d4f720b62c166e071450a4c2f749a516bf7a6bafc166466380b4293d53da5c"} Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.451326 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.602700 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj9gr\" (UniqueName: \"kubernetes.io/projected/666be2f4-0416-4086-94d3-c48c82f380b2-kube-api-access-hj9gr\") pod \"666be2f4-0416-4086-94d3-c48c82f380b2\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.602833 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-combined-ca-bundle\") pod \"666be2f4-0416-4086-94d3-c48c82f380b2\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.602943 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-fernet-keys\") pod \"666be2f4-0416-4086-94d3-c48c82f380b2\" (UID: 
\"666be2f4-0416-4086-94d3-c48c82f380b2\") " Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.602965 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-config-data\") pod \"666be2f4-0416-4086-94d3-c48c82f380b2\" (UID: \"666be2f4-0416-4086-94d3-c48c82f380b2\") " Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.612693 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "666be2f4-0416-4086-94d3-c48c82f380b2" (UID: "666be2f4-0416-4086-94d3-c48c82f380b2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.618695 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/666be2f4-0416-4086-94d3-c48c82f380b2-kube-api-access-hj9gr" (OuterVolumeSpecName: "kube-api-access-hj9gr") pod "666be2f4-0416-4086-94d3-c48c82f380b2" (UID: "666be2f4-0416-4086-94d3-c48c82f380b2"). InnerVolumeSpecName "kube-api-access-hj9gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.647913 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "666be2f4-0416-4086-94d3-c48c82f380b2" (UID: "666be2f4-0416-4086-94d3-c48c82f380b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.680744 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-config-data" (OuterVolumeSpecName: "config-data") pod "666be2f4-0416-4086-94d3-c48c82f380b2" (UID: "666be2f4-0416-4086-94d3-c48c82f380b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.704964 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj9gr\" (UniqueName: \"kubernetes.io/projected/666be2f4-0416-4086-94d3-c48c82f380b2-kube-api-access-hj9gr\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.705012 4839 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.705023 4839 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:05 crc kubenswrapper[4839]: I0321 05:01:05.705035 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/666be2f4-0416-4086-94d3-c48c82f380b2-config-data\") on node \"crc\" DevicePath \"\"" Mar 21 05:01:06 crc kubenswrapper[4839]: I0321 05:01:06.044644 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567821-rmctn" event={"ID":"666be2f4-0416-4086-94d3-c48c82f380b2","Type":"ContainerDied","Data":"84a267c39feac16f072a5873d4f95a7263c12a65583247dd89372cebb746437c"} Mar 21 05:01:06 crc kubenswrapper[4839]: I0321 05:01:06.045015 4839 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="84a267c39feac16f072a5873d4f95a7263c12a65583247dd89372cebb746437c" Mar 21 05:01:06 crc kubenswrapper[4839]: I0321 05:01:06.045103 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29567821-rmctn" Mar 21 05:01:30 crc kubenswrapper[4839]: I0321 05:01:30.980486 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:01:30 crc kubenswrapper[4839]: I0321 05:01:30.981084 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:01:30 crc kubenswrapper[4839]: I0321 05:01:30.981132 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:01:30 crc kubenswrapper[4839]: I0321 05:01:30.981890 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3d8592746d13c0ed95f298f8a3279e3766ac0141ca420e1630c7b54039959ed"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:01:30 crc kubenswrapper[4839]: I0321 05:01:30.981955 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" 
containerID="cri-o://f3d8592746d13c0ed95f298f8a3279e3766ac0141ca420e1630c7b54039959ed" gracePeriod=600 Mar 21 05:01:31 crc kubenswrapper[4839]: I0321 05:01:31.267147 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="f3d8592746d13c0ed95f298f8a3279e3766ac0141ca420e1630c7b54039959ed" exitCode=0 Mar 21 05:01:31 crc kubenswrapper[4839]: I0321 05:01:31.267231 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"f3d8592746d13c0ed95f298f8a3279e3766ac0141ca420e1630c7b54039959ed"} Mar 21 05:01:31 crc kubenswrapper[4839]: I0321 05:01:31.267471 4839 scope.go:117] "RemoveContainer" containerID="27713c304335ae18a33ce2c1e07fb2a32c8ccd655a1db46168d0e6b4f0325109" Mar 21 05:01:32 crc kubenswrapper[4839]: I0321 05:01:32.278160 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21"} Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.144561 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567822-666xq"] Mar 21 05:02:00 crc kubenswrapper[4839]: E0321 05:02:00.146404 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="666be2f4-0416-4086-94d3-c48c82f380b2" containerName="keystone-cron" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.146485 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="666be2f4-0416-4086-94d3-c48c82f380b2" containerName="keystone-cron" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.146739 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="666be2f4-0416-4086-94d3-c48c82f380b2" containerName="keystone-cron" Mar 21 05:02:00 crc 
kubenswrapper[4839]: I0321 05:02:00.147514 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-666xq" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.149415 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.149697 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.149722 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.152639 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-666xq"] Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.219057 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7vdz\" (UniqueName: \"kubernetes.io/projected/5246ade9-02c7-4a6c-b903-f556b6405d03-kube-api-access-k7vdz\") pod \"auto-csr-approver-29567822-666xq\" (UID: \"5246ade9-02c7-4a6c-b903-f556b6405d03\") " pod="openshift-infra/auto-csr-approver-29567822-666xq" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.320366 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7vdz\" (UniqueName: \"kubernetes.io/projected/5246ade9-02c7-4a6c-b903-f556b6405d03-kube-api-access-k7vdz\") pod \"auto-csr-approver-29567822-666xq\" (UID: \"5246ade9-02c7-4a6c-b903-f556b6405d03\") " pod="openshift-infra/auto-csr-approver-29567822-666xq" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.339816 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7vdz\" (UniqueName: \"kubernetes.io/projected/5246ade9-02c7-4a6c-b903-f556b6405d03-kube-api-access-k7vdz\") pod 
\"auto-csr-approver-29567822-666xq\" (UID: \"5246ade9-02c7-4a6c-b903-f556b6405d03\") " pod="openshift-infra/auto-csr-approver-29567822-666xq" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.467459 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-666xq" Mar 21 05:02:00 crc kubenswrapper[4839]: I0321 05:02:00.894786 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-666xq"] Mar 21 05:02:01 crc kubenswrapper[4839]: I0321 05:02:01.564385 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567822-666xq" event={"ID":"5246ade9-02c7-4a6c-b903-f556b6405d03","Type":"ContainerStarted","Data":"6d1fe1e4a0918fd8ee0bb394d0dab442d8e00841c7e225f3a4940e9ef79b272f"} Mar 21 05:02:03 crc kubenswrapper[4839]: I0321 05:02:03.581695 4839 generic.go:334] "Generic (PLEG): container finished" podID="5246ade9-02c7-4a6c-b903-f556b6405d03" containerID="292e8e6107a41a041900da65d2d65595f3753b426bbb8c2a06a65273c04fd6b1" exitCode=0 Mar 21 05:02:03 crc kubenswrapper[4839]: I0321 05:02:03.581784 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567822-666xq" event={"ID":"5246ade9-02c7-4a6c-b903-f556b6405d03","Type":"ContainerDied","Data":"292e8e6107a41a041900da65d2d65595f3753b426bbb8c2a06a65273c04fd6b1"} Mar 21 05:02:04 crc kubenswrapper[4839]: I0321 05:02:04.951728 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-666xq" Mar 21 05:02:05 crc kubenswrapper[4839]: I0321 05:02:05.016950 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7vdz\" (UniqueName: \"kubernetes.io/projected/5246ade9-02c7-4a6c-b903-f556b6405d03-kube-api-access-k7vdz\") pod \"5246ade9-02c7-4a6c-b903-f556b6405d03\" (UID: \"5246ade9-02c7-4a6c-b903-f556b6405d03\") " Mar 21 05:02:05 crc kubenswrapper[4839]: I0321 05:02:05.022899 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5246ade9-02c7-4a6c-b903-f556b6405d03-kube-api-access-k7vdz" (OuterVolumeSpecName: "kube-api-access-k7vdz") pod "5246ade9-02c7-4a6c-b903-f556b6405d03" (UID: "5246ade9-02c7-4a6c-b903-f556b6405d03"). InnerVolumeSpecName "kube-api-access-k7vdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:02:05 crc kubenswrapper[4839]: I0321 05:02:05.118856 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7vdz\" (UniqueName: \"kubernetes.io/projected/5246ade9-02c7-4a6c-b903-f556b6405d03-kube-api-access-k7vdz\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:05 crc kubenswrapper[4839]: I0321 05:02:05.620775 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567822-666xq" event={"ID":"5246ade9-02c7-4a6c-b903-f556b6405d03","Type":"ContainerDied","Data":"6d1fe1e4a0918fd8ee0bb394d0dab442d8e00841c7e225f3a4940e9ef79b272f"} Mar 21 05:02:05 crc kubenswrapper[4839]: I0321 05:02:05.621298 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d1fe1e4a0918fd8ee0bb394d0dab442d8e00841c7e225f3a4940e9ef79b272f" Mar 21 05:02:05 crc kubenswrapper[4839]: I0321 05:02:05.620837 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567822-666xq" Mar 21 05:02:06 crc kubenswrapper[4839]: I0321 05:02:06.016449 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-8qfld"] Mar 21 05:02:06 crc kubenswrapper[4839]: I0321 05:02:06.022842 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567816-8qfld"] Mar 21 05:02:06 crc kubenswrapper[4839]: I0321 05:02:06.463091 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8" path="/var/lib/kubelet/pods/f5c6cd43-2bb2-46cd-90ec-b1037aaae8a8/volumes" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.482945 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bww"] Mar 21 05:02:10 crc kubenswrapper[4839]: E0321 05:02:10.485470 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5246ade9-02c7-4a6c-b903-f556b6405d03" containerName="oc" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.485494 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5246ade9-02c7-4a6c-b903-f556b6405d03" containerName="oc" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.485694 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5246ade9-02c7-4a6c-b903-f556b6405d03" containerName="oc" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.487278 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.501177 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bww"] Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.558038 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-utilities\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.558428 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk578\" (UniqueName: \"kubernetes.io/projected/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-kube-api-access-hk578\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.558496 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-catalog-content\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.662211 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk578\" (UniqueName: \"kubernetes.io/projected/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-kube-api-access-hk578\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.662324 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-catalog-content\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.662508 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-utilities\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.663130 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-catalog-content\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.663196 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-utilities\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.683787 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk578\" (UniqueName: \"kubernetes.io/projected/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-kube-api-access-hk578\") pod \"redhat-marketplace-x7bww\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:10 crc kubenswrapper[4839]: I0321 05:02:10.808836 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:11 crc kubenswrapper[4839]: I0321 05:02:11.310692 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bww"] Mar 21 05:02:11 crc kubenswrapper[4839]: I0321 05:02:11.672046 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bww" event={"ID":"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3","Type":"ContainerStarted","Data":"373a8c5aa8fdbb58b3ae94a725b57b695a91f013d0b4d98618d1c738df727506"} Mar 21 05:02:12 crc kubenswrapper[4839]: I0321 05:02:12.682174 4839 generic.go:334] "Generic (PLEG): container finished" podID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerID="1efc52951d43a245177e4dff7cfd1e3426fa930b28133acb294aea6743e70139" exitCode=0 Mar 21 05:02:12 crc kubenswrapper[4839]: I0321 05:02:12.682233 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bww" event={"ID":"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3","Type":"ContainerDied","Data":"1efc52951d43a245177e4dff7cfd1e3426fa930b28133acb294aea6743e70139"} Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.482669 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pb97p"] Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.484985 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.493391 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pb97p"] Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.616999 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-catalog-content\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.617130 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgwg2\" (UniqueName: \"kubernetes.io/projected/f939e367-e323-4cac-85d0-55d26d60f4ec-kube-api-access-sgwg2\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.617187 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-utilities\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.719242 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgwg2\" (UniqueName: \"kubernetes.io/projected/f939e367-e323-4cac-85d0-55d26d60f4ec-kube-api-access-sgwg2\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.719584 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-utilities\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.719833 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-catalog-content\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.720235 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-utilities\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.720260 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-catalog-content\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.742515 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgwg2\" (UniqueName: \"kubernetes.io/projected/f939e367-e323-4cac-85d0-55d26d60f4ec-kube-api-access-sgwg2\") pod \"certified-operators-pb97p\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:13 crc kubenswrapper[4839]: I0321 05:02:13.807597 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:14 crc kubenswrapper[4839]: W0321 05:02:14.325092 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf939e367_e323_4cac_85d0_55d26d60f4ec.slice/crio-226aff01fcb6e9c576f943e6b8a6622749c456bb21bf2cb5df8b9ab8362e0e3e WatchSource:0}: Error finding container 226aff01fcb6e9c576f943e6b8a6622749c456bb21bf2cb5df8b9ab8362e0e3e: Status 404 returned error can't find the container with id 226aff01fcb6e9c576f943e6b8a6622749c456bb21bf2cb5df8b9ab8362e0e3e Mar 21 05:02:14 crc kubenswrapper[4839]: I0321 05:02:14.325808 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pb97p"] Mar 21 05:02:14 crc kubenswrapper[4839]: I0321 05:02:14.701871 4839 generic.go:334] "Generic (PLEG): container finished" podID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerID="4b22a92b0ab8fcff90ca92aa57b5aa47ae8ec5ba62184c302116eb1d72e13a3d" exitCode=0 Mar 21 05:02:14 crc kubenswrapper[4839]: I0321 05:02:14.701969 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bww" event={"ID":"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3","Type":"ContainerDied","Data":"4b22a92b0ab8fcff90ca92aa57b5aa47ae8ec5ba62184c302116eb1d72e13a3d"} Mar 21 05:02:14 crc kubenswrapper[4839]: I0321 05:02:14.704352 4839 generic.go:334] "Generic (PLEG): container finished" podID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerID="05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55" exitCode=0 Mar 21 05:02:14 crc kubenswrapper[4839]: I0321 05:02:14.704390 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pb97p" event={"ID":"f939e367-e323-4cac-85d0-55d26d60f4ec","Type":"ContainerDied","Data":"05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55"} Mar 21 05:02:14 crc kubenswrapper[4839]: I0321 
05:02:14.704413 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pb97p" event={"ID":"f939e367-e323-4cac-85d0-55d26d60f4ec","Type":"ContainerStarted","Data":"226aff01fcb6e9c576f943e6b8a6622749c456bb21bf2cb5df8b9ab8362e0e3e"} Mar 21 05:02:15 crc kubenswrapper[4839]: I0321 05:02:15.715859 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bww" event={"ID":"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3","Type":"ContainerStarted","Data":"b9de2139979cd92ed9f8ddab30d1a42bd7d67a863ada145cf6cbe71703537956"} Mar 21 05:02:15 crc kubenswrapper[4839]: I0321 05:02:15.718472 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pb97p" event={"ID":"f939e367-e323-4cac-85d0-55d26d60f4ec","Type":"ContainerStarted","Data":"45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f"} Mar 21 05:02:15 crc kubenswrapper[4839]: I0321 05:02:15.740239 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x7bww" podStartSLOduration=3.075991648 podStartE2EDuration="5.740214745s" podCreationTimestamp="2026-03-21 05:02:10 +0000 UTC" firstStartedPulling="2026-03-21 05:02:12.683834041 +0000 UTC m=+2337.011620757" lastFinishedPulling="2026-03-21 05:02:15.348057178 +0000 UTC m=+2339.675843854" observedRunningTime="2026-03-21 05:02:15.733697071 +0000 UTC m=+2340.061483767" watchObservedRunningTime="2026-03-21 05:02:15.740214745 +0000 UTC m=+2340.068001421" Mar 21 05:02:16 crc kubenswrapper[4839]: I0321 05:02:16.728847 4839 generic.go:334] "Generic (PLEG): container finished" podID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerID="45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f" exitCode=0 Mar 21 05:02:16 crc kubenswrapper[4839]: I0321 05:02:16.728960 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pb97p" 
event={"ID":"f939e367-e323-4cac-85d0-55d26d60f4ec","Type":"ContainerDied","Data":"45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f"} Mar 21 05:02:20 crc kubenswrapper[4839]: I0321 05:02:20.810257 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:20 crc kubenswrapper[4839]: I0321 05:02:20.810891 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:20 crc kubenswrapper[4839]: I0321 05:02:20.858398 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:21 crc kubenswrapper[4839]: I0321 05:02:21.824099 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:21 crc kubenswrapper[4839]: I0321 05:02:21.868918 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bww"] Mar 21 05:02:22 crc kubenswrapper[4839]: I0321 05:02:22.789541 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pb97p" event={"ID":"f939e367-e323-4cac-85d0-55d26d60f4ec","Type":"ContainerStarted","Data":"935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef"} Mar 21 05:02:23 crc kubenswrapper[4839]: I0321 05:02:23.799320 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x7bww" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="registry-server" containerID="cri-o://b9de2139979cd92ed9f8ddab30d1a42bd7d67a863ada145cf6cbe71703537956" gracePeriod=2 Mar 21 05:02:23 crc kubenswrapper[4839]: I0321 05:02:23.808762 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:23 crc kubenswrapper[4839]: 
I0321 05:02:23.809209 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:23 crc kubenswrapper[4839]: I0321 05:02:23.819658 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pb97p" podStartSLOduration=4.034429463 podStartE2EDuration="10.819623626s" podCreationTimestamp="2026-03-21 05:02:13 +0000 UTC" firstStartedPulling="2026-03-21 05:02:14.706362406 +0000 UTC m=+2339.034149082" lastFinishedPulling="2026-03-21 05:02:21.491556569 +0000 UTC m=+2345.819343245" observedRunningTime="2026-03-21 05:02:23.817887397 +0000 UTC m=+2348.145674073" watchObservedRunningTime="2026-03-21 05:02:23.819623626 +0000 UTC m=+2348.147410302" Mar 21 05:02:24 crc kubenswrapper[4839]: I0321 05:02:24.863256 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pb97p" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="registry-server" probeResult="failure" output=< Mar 21 05:02:24 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 05:02:24 crc kubenswrapper[4839]: > Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.832971 4839 generic.go:334] "Generic (PLEG): container finished" podID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerID="b9de2139979cd92ed9f8ddab30d1a42bd7d67a863ada145cf6cbe71703537956" exitCode=0 Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.833156 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bww" event={"ID":"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3","Type":"ContainerDied","Data":"b9de2139979cd92ed9f8ddab30d1a42bd7d67a863ada145cf6cbe71703537956"} Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.833303 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x7bww" 
event={"ID":"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3","Type":"ContainerDied","Data":"373a8c5aa8fdbb58b3ae94a725b57b695a91f013d0b4d98618d1c738df727506"} Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.833330 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="373a8c5aa8fdbb58b3ae94a725b57b695a91f013d0b4d98618d1c738df727506" Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.842817 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.950049 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-catalog-content\") pod \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.950292 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-utilities\") pod \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.950383 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk578\" (UniqueName: \"kubernetes.io/projected/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-kube-api-access-hk578\") pod \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\" (UID: \"1efcbe59-69a0-46b0-b47b-e7e6fd1502c3\") " Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.950986 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-utilities" (OuterVolumeSpecName: "utilities") pod "1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" (UID: "1efcbe59-69a0-46b0-b47b-e7e6fd1502c3"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.956012 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-kube-api-access-hk578" (OuterVolumeSpecName: "kube-api-access-hk578") pod "1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" (UID: "1efcbe59-69a0-46b0-b47b-e7e6fd1502c3"). InnerVolumeSpecName "kube-api-access-hk578". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:02:25 crc kubenswrapper[4839]: I0321 05:02:25.977524 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" (UID: "1efcbe59-69a0-46b0-b47b-e7e6fd1502c3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:02:26 crc kubenswrapper[4839]: I0321 05:02:26.053166 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:26 crc kubenswrapper[4839]: I0321 05:02:26.053221 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk578\" (UniqueName: \"kubernetes.io/projected/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-kube-api-access-hk578\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:26 crc kubenswrapper[4839]: I0321 05:02:26.053238 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:26 crc kubenswrapper[4839]: I0321 05:02:26.840149 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x7bww" Mar 21 05:02:26 crc kubenswrapper[4839]: I0321 05:02:26.864375 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bww"] Mar 21 05:02:26 crc kubenswrapper[4839]: I0321 05:02:26.873474 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x7bww"] Mar 21 05:02:28 crc kubenswrapper[4839]: I0321 05:02:28.462065 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" path="/var/lib/kubelet/pods/1efcbe59-69a0-46b0-b47b-e7e6fd1502c3/volumes" Mar 21 05:02:33 crc kubenswrapper[4839]: I0321 05:02:33.859731 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:33 crc kubenswrapper[4839]: I0321 05:02:33.919205 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:34 crc kubenswrapper[4839]: I0321 05:02:34.099119 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pb97p"] Mar 21 05:02:34 crc kubenswrapper[4839]: I0321 05:02:34.903346 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pb97p" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="registry-server" containerID="cri-o://935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef" gracePeriod=2 Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.371451 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.427588 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-catalog-content\") pod \"f939e367-e323-4cac-85d0-55d26d60f4ec\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.427848 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-utilities\") pod \"f939e367-e323-4cac-85d0-55d26d60f4ec\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.427968 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgwg2\" (UniqueName: \"kubernetes.io/projected/f939e367-e323-4cac-85d0-55d26d60f4ec-kube-api-access-sgwg2\") pod \"f939e367-e323-4cac-85d0-55d26d60f4ec\" (UID: \"f939e367-e323-4cac-85d0-55d26d60f4ec\") " Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.428864 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-utilities" (OuterVolumeSpecName: "utilities") pod "f939e367-e323-4cac-85d0-55d26d60f4ec" (UID: "f939e367-e323-4cac-85d0-55d26d60f4ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.440970 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f939e367-e323-4cac-85d0-55d26d60f4ec-kube-api-access-sgwg2" (OuterVolumeSpecName: "kube-api-access-sgwg2") pod "f939e367-e323-4cac-85d0-55d26d60f4ec" (UID: "f939e367-e323-4cac-85d0-55d26d60f4ec"). InnerVolumeSpecName "kube-api-access-sgwg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.486188 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f939e367-e323-4cac-85d0-55d26d60f4ec" (UID: "f939e367-e323-4cac-85d0-55d26d60f4ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.532278 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.532412 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f939e367-e323-4cac-85d0-55d26d60f4ec-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.532437 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgwg2\" (UniqueName: \"kubernetes.io/projected/f939e367-e323-4cac-85d0-55d26d60f4ec-kube-api-access-sgwg2\") on node \"crc\" DevicePath \"\"" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.913837 4839 generic.go:334] "Generic (PLEG): container finished" podID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerID="935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef" exitCode=0 Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.913903 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pb97p" event={"ID":"f939e367-e323-4cac-85d0-55d26d60f4ec","Type":"ContainerDied","Data":"935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef"} Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.913934 4839 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pb97p" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.913979 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pb97p" event={"ID":"f939e367-e323-4cac-85d0-55d26d60f4ec","Type":"ContainerDied","Data":"226aff01fcb6e9c576f943e6b8a6622749c456bb21bf2cb5df8b9ab8362e0e3e"} Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.914016 4839 scope.go:117] "RemoveContainer" containerID="935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.938655 4839 scope.go:117] "RemoveContainer" containerID="45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.962019 4839 scope.go:117] "RemoveContainer" containerID="05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55" Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.965785 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pb97p"] Mar 21 05:02:35 crc kubenswrapper[4839]: I0321 05:02:35.974216 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pb97p"] Mar 21 05:02:36 crc kubenswrapper[4839]: I0321 05:02:36.013149 4839 scope.go:117] "RemoveContainer" containerID="935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef" Mar 21 05:02:36 crc kubenswrapper[4839]: E0321 05:02:36.013915 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef\": container with ID starting with 935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef not found: ID does not exist" containerID="935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef" Mar 21 05:02:36 crc kubenswrapper[4839]: I0321 05:02:36.013960 
4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef"} err="failed to get container status \"935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef\": rpc error: code = NotFound desc = could not find container \"935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef\": container with ID starting with 935f7c36e9c71c4fca49f4aab7c929a57af2a445189275ccfc5955dcd95764ef not found: ID does not exist" Mar 21 05:02:36 crc kubenswrapper[4839]: I0321 05:02:36.013986 4839 scope.go:117] "RemoveContainer" containerID="45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f" Mar 21 05:02:36 crc kubenswrapper[4839]: E0321 05:02:36.014410 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f\": container with ID starting with 45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f not found: ID does not exist" containerID="45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f" Mar 21 05:02:36 crc kubenswrapper[4839]: I0321 05:02:36.014438 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f"} err="failed to get container status \"45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f\": rpc error: code = NotFound desc = could not find container \"45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f\": container with ID starting with 45573170de201923aa2e043b497080bdb7b19d0d4e30d2bc220f9d78a14fec2f not found: ID does not exist" Mar 21 05:02:36 crc kubenswrapper[4839]: I0321 05:02:36.014461 4839 scope.go:117] "RemoveContainer" containerID="05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55" Mar 21 05:02:36 crc kubenswrapper[4839]: E0321 
05:02:36.014743 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55\": container with ID starting with 05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55 not found: ID does not exist" containerID="05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55" Mar 21 05:02:36 crc kubenswrapper[4839]: I0321 05:02:36.014766 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55"} err="failed to get container status \"05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55\": rpc error: code = NotFound desc = could not find container \"05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55\": container with ID starting with 05536196eb9060091b02cda5458e67b22a4301abe94bd268ab45cbb2e462af55 not found: ID does not exist" Mar 21 05:02:36 crc kubenswrapper[4839]: I0321 05:02:36.467128 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" path="/var/lib/kubelet/pods/f939e367-e323-4cac-85d0-55d26d60f4ec/volumes" Mar 21 05:02:55 crc kubenswrapper[4839]: I0321 05:02:55.777666 4839 scope.go:117] "RemoveContainer" containerID="4fe2426cb283c93b9728be8cbc10600e5f92f98c8d9cf9800594541cb0424886" Mar 21 05:03:21 crc kubenswrapper[4839]: I0321 05:03:21.349552 4839 generic.go:334] "Generic (PLEG): container finished" podID="2d056acb-0183-4157-a830-fff4cd1dcacf" containerID="38923cbb1565ae7e426fbfa5a7cacab2c7dc20c694af3c1982bdf3aeaab3650d" exitCode=0 Mar 21 05:03:21 crc kubenswrapper[4839]: I0321 05:03:21.349659 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" 
event={"ID":"2d056acb-0183-4157-a830-fff4cd1dcacf","Type":"ContainerDied","Data":"38923cbb1565ae7e426fbfa5a7cacab2c7dc20c694af3c1982bdf3aeaab3650d"} Mar 21 05:03:22 crc kubenswrapper[4839]: I0321 05:03:22.847361 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.021341 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-inventory\") pod \"2d056acb-0183-4157-a830-fff4cd1dcacf\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.021446 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-secret-0\") pod \"2d056acb-0183-4157-a830-fff4cd1dcacf\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.021511 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-combined-ca-bundle\") pod \"2d056acb-0183-4157-a830-fff4cd1dcacf\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.021823 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-ssh-key-openstack-edpm-ipam\") pod \"2d056acb-0183-4157-a830-fff4cd1dcacf\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.021974 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf8mj\" (UniqueName: 
\"kubernetes.io/projected/2d056acb-0183-4157-a830-fff4cd1dcacf-kube-api-access-pf8mj\") pod \"2d056acb-0183-4157-a830-fff4cd1dcacf\" (UID: \"2d056acb-0183-4157-a830-fff4cd1dcacf\") " Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.028463 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2d056acb-0183-4157-a830-fff4cd1dcacf" (UID: "2d056acb-0183-4157-a830-fff4cd1dcacf"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.032741 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d056acb-0183-4157-a830-fff4cd1dcacf-kube-api-access-pf8mj" (OuterVolumeSpecName: "kube-api-access-pf8mj") pod "2d056acb-0183-4157-a830-fff4cd1dcacf" (UID: "2d056acb-0183-4157-a830-fff4cd1dcacf"). InnerVolumeSpecName "kube-api-access-pf8mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.058874 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2d056acb-0183-4157-a830-fff4cd1dcacf" (UID: "2d056acb-0183-4157-a830-fff4cd1dcacf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.061184 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-inventory" (OuterVolumeSpecName: "inventory") pod "2d056acb-0183-4157-a830-fff4cd1dcacf" (UID: "2d056acb-0183-4157-a830-fff4cd1dcacf"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.074824 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "2d056acb-0183-4157-a830-fff4cd1dcacf" (UID: "2d056acb-0183-4157-a830-fff4cd1dcacf"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.125159 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.125215 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf8mj\" (UniqueName: \"kubernetes.io/projected/2d056acb-0183-4157-a830-fff4cd1dcacf-kube-api-access-pf8mj\") on node \"crc\" DevicePath \"\"" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.125236 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.125255 4839 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.125274 4839 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d056acb-0183-4157-a830-fff4cd1dcacf-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.373252 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" event={"ID":"2d056acb-0183-4157-a830-fff4cd1dcacf","Type":"ContainerDied","Data":"a47bbc08aef3ecca5f0392206f51820ac7164fd826b5332b0545a1c8dd6795ca"} Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.373718 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a47bbc08aef3ecca5f0392206f51820ac7164fd826b5332b0545a1c8dd6795ca" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.373303 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-w48j6" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.472862 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f"] Mar 21 05:03:23 crc kubenswrapper[4839]: E0321 05:03:23.473305 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="extract-content" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473329 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="extract-content" Mar 21 05:03:23 crc kubenswrapper[4839]: E0321 05:03:23.473347 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="registry-server" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473354 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="registry-server" Mar 21 05:03:23 crc kubenswrapper[4839]: E0321 05:03:23.473381 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="extract-content" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473389 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" 
containerName="extract-content" Mar 21 05:03:23 crc kubenswrapper[4839]: E0321 05:03:23.473401 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="registry-server" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473408 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="registry-server" Mar 21 05:03:23 crc kubenswrapper[4839]: E0321 05:03:23.473425 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="extract-utilities" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473434 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="extract-utilities" Mar 21 05:03:23 crc kubenswrapper[4839]: E0321 05:03:23.473448 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d056acb-0183-4157-a830-fff4cd1dcacf" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473457 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d056acb-0183-4157-a830-fff4cd1dcacf" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 21 05:03:23 crc kubenswrapper[4839]: E0321 05:03:23.473471 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="extract-utilities" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473482 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="extract-utilities" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473750 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efcbe59-69a0-46b0-b47b-e7e6fd1502c3" containerName="registry-server" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473764 4839 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2d056acb-0183-4157-a830-fff4cd1dcacf" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.473784 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f939e367-e323-4cac-85d0-55d26d60f4ec" containerName="registry-server" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.474600 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.481396 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.481600 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.481834 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.483986 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.484141 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.484165 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.484475 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.497005 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f"] Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532646 4839 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532742 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532771 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532803 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532829 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532846 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532879 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532897 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jjdv\" (UniqueName: \"kubernetes.io/projected/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-kube-api-access-5jjdv\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532926 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: 
\"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.532981 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.533005 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.634865 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.634945 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.634976 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.634998 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.635020 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.635040 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.635072 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-combined-ca-bundle\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.635089 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jjdv\" (UniqueName: \"kubernetes.io/projected/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-kube-api-access-5jjdv\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.635116 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.635154 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.635172 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 
05:03:23.636413 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.640928 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.641790 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.641930 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.642094 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.642243 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.642248 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.642255 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.643942 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.644693 4839 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.654324 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jjdv\" (UniqueName: \"kubernetes.io/projected/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-kube-api-access-5jjdv\") pod \"nova-edpm-deployment-openstack-edpm-ipam-hf42f\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:23 crc kubenswrapper[4839]: I0321 05:03:23.793425 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:03:24 crc kubenswrapper[4839]: W0321 05:03:24.382143 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f8728ca_30ff_41a9_8a48_e3bb7911bcc7.slice/crio-8fc0a22e4f19211488bb9325fd259e084fbec2617916e0087741e1f7592e5580 WatchSource:0}: Error finding container 8fc0a22e4f19211488bb9325fd259e084fbec2617916e0087741e1f7592e5580: Status 404 returned error can't find the container with id 8fc0a22e4f19211488bb9325fd259e084fbec2617916e0087741e1f7592e5580 Mar 21 05:03:24 crc kubenswrapper[4839]: I0321 05:03:24.384100 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f"] Mar 21 05:03:25 crc kubenswrapper[4839]: I0321 05:03:25.393352 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" 
event={"ID":"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7","Type":"ContainerStarted","Data":"cee45afd6f3370035efcc95b2e9996fd151275bd0576f734a67b97aa4b99e58e"} Mar 21 05:03:25 crc kubenswrapper[4839]: I0321 05:03:25.393778 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" event={"ID":"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7","Type":"ContainerStarted","Data":"8fc0a22e4f19211488bb9325fd259e084fbec2617916e0087741e1f7592e5580"} Mar 21 05:03:25 crc kubenswrapper[4839]: I0321 05:03:25.424071 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" podStartSLOduration=1.999634482 podStartE2EDuration="2.42404876s" podCreationTimestamp="2026-03-21 05:03:23 +0000 UTC" firstStartedPulling="2026-03-21 05:03:24.385063112 +0000 UTC m=+2408.712849788" lastFinishedPulling="2026-03-21 05:03:24.80947738 +0000 UTC m=+2409.137264066" observedRunningTime="2026-03-21 05:03:25.421238401 +0000 UTC m=+2409.749025107" watchObservedRunningTime="2026-03-21 05:03:25.42404876 +0000 UTC m=+2409.751835446" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.141932 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567824-vx55r"] Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.143784 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-vx55r" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.146768 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.146790 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.147013 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.153887 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-vx55r"] Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.312028 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxk6t\" (UniqueName: \"kubernetes.io/projected/0bbefcc3-042e-4587-b172-1a1de0f34dcf-kube-api-access-fxk6t\") pod \"auto-csr-approver-29567824-vx55r\" (UID: \"0bbefcc3-042e-4587-b172-1a1de0f34dcf\") " pod="openshift-infra/auto-csr-approver-29567824-vx55r" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.413809 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxk6t\" (UniqueName: \"kubernetes.io/projected/0bbefcc3-042e-4587-b172-1a1de0f34dcf-kube-api-access-fxk6t\") pod \"auto-csr-approver-29567824-vx55r\" (UID: \"0bbefcc3-042e-4587-b172-1a1de0f34dcf\") " pod="openshift-infra/auto-csr-approver-29567824-vx55r" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.433298 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxk6t\" (UniqueName: \"kubernetes.io/projected/0bbefcc3-042e-4587-b172-1a1de0f34dcf-kube-api-access-fxk6t\") pod \"auto-csr-approver-29567824-vx55r\" (UID: \"0bbefcc3-042e-4587-b172-1a1de0f34dcf\") " 
pod="openshift-infra/auto-csr-approver-29567824-vx55r" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.464979 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-vx55r" Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.902526 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-vx55r"] Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.980876 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:04:00 crc kubenswrapper[4839]: I0321 05:04:00.981170 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:04:01 crc kubenswrapper[4839]: I0321 05:04:01.763099 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567824-vx55r" event={"ID":"0bbefcc3-042e-4587-b172-1a1de0f34dcf","Type":"ContainerStarted","Data":"6a274b1ac44ecf31582173215cc84369c49fffb426ab7c2ea960eb33ff65416b"} Mar 21 05:04:02 crc kubenswrapper[4839]: I0321 05:04:02.772105 4839 generic.go:334] "Generic (PLEG): container finished" podID="0bbefcc3-042e-4587-b172-1a1de0f34dcf" containerID="ca1bd38f0e84cbc6abf00446444e014421818ecf4311848a17094cd139ec8ed6" exitCode=0 Mar 21 05:04:02 crc kubenswrapper[4839]: I0321 05:04:02.772170 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567824-vx55r" 
event={"ID":"0bbefcc3-042e-4587-b172-1a1de0f34dcf","Type":"ContainerDied","Data":"ca1bd38f0e84cbc6abf00446444e014421818ecf4311848a17094cd139ec8ed6"} Mar 21 05:04:04 crc kubenswrapper[4839]: I0321 05:04:04.097499 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-vx55r" Mar 21 05:04:04 crc kubenswrapper[4839]: I0321 05:04:04.288029 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxk6t\" (UniqueName: \"kubernetes.io/projected/0bbefcc3-042e-4587-b172-1a1de0f34dcf-kube-api-access-fxk6t\") pod \"0bbefcc3-042e-4587-b172-1a1de0f34dcf\" (UID: \"0bbefcc3-042e-4587-b172-1a1de0f34dcf\") " Mar 21 05:04:04 crc kubenswrapper[4839]: I0321 05:04:04.293855 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bbefcc3-042e-4587-b172-1a1de0f34dcf-kube-api-access-fxk6t" (OuterVolumeSpecName: "kube-api-access-fxk6t") pod "0bbefcc3-042e-4587-b172-1a1de0f34dcf" (UID: "0bbefcc3-042e-4587-b172-1a1de0f34dcf"). InnerVolumeSpecName "kube-api-access-fxk6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:04:04 crc kubenswrapper[4839]: I0321 05:04:04.391030 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxk6t\" (UniqueName: \"kubernetes.io/projected/0bbefcc3-042e-4587-b172-1a1de0f34dcf-kube-api-access-fxk6t\") on node \"crc\" DevicePath \"\"" Mar 21 05:04:04 crc kubenswrapper[4839]: I0321 05:04:04.797460 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567824-vx55r" event={"ID":"0bbefcc3-042e-4587-b172-1a1de0f34dcf","Type":"ContainerDied","Data":"6a274b1ac44ecf31582173215cc84369c49fffb426ab7c2ea960eb33ff65416b"} Mar 21 05:04:04 crc kubenswrapper[4839]: I0321 05:04:04.797783 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a274b1ac44ecf31582173215cc84369c49fffb426ab7c2ea960eb33ff65416b" Mar 21 05:04:04 crc kubenswrapper[4839]: I0321 05:04:04.797538 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567824-vx55r" Mar 21 05:04:05 crc kubenswrapper[4839]: I0321 05:04:05.174928 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-qzz8l"] Mar 21 05:04:05 crc kubenswrapper[4839]: I0321 05:04:05.185423 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567818-qzz8l"] Mar 21 05:04:06 crc kubenswrapper[4839]: I0321 05:04:06.463601 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b644e0-9d17-491d-be8c-359dd9f82604" path="/var/lib/kubelet/pods/04b644e0-9d17-491d-be8c-359dd9f82604/volumes" Mar 21 05:04:30 crc kubenswrapper[4839]: I0321 05:04:30.980611 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 21 05:04:30 crc kubenswrapper[4839]: I0321 05:04:30.981455 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:04:55 crc kubenswrapper[4839]: I0321 05:04:55.877146 4839 scope.go:117] "RemoveContainer" containerID="c879183e5f723b0bd5065e25afca82cc281c19704902af0285103b69c58011ac" Mar 21 05:05:00 crc kubenswrapper[4839]: I0321 05:05:00.980209 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:05:00 crc kubenswrapper[4839]: I0321 05:05:00.981013 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:05:00 crc kubenswrapper[4839]: I0321 05:05:00.981089 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:05:00 crc kubenswrapper[4839]: I0321 05:05:00.982219 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
21 05:05:00 crc kubenswrapper[4839]: I0321 05:05:00.982325 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" gracePeriod=600 Mar 21 05:05:01 crc kubenswrapper[4839]: E0321 05:05:01.106758 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:05:01 crc kubenswrapper[4839]: I0321 05:05:01.297675 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" exitCode=0 Mar 21 05:05:01 crc kubenswrapper[4839]: I0321 05:05:01.297729 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21"} Mar 21 05:05:01 crc kubenswrapper[4839]: I0321 05:05:01.297787 4839 scope.go:117] "RemoveContainer" containerID="f3d8592746d13c0ed95f298f8a3279e3766ac0141ca420e1630c7b54039959ed" Mar 21 05:05:01 crc kubenswrapper[4839]: I0321 05:05:01.298648 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:05:01 crc kubenswrapper[4839]: E0321 05:05:01.299104 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:05:13 crc kubenswrapper[4839]: I0321 05:05:13.453971 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:05:13 crc kubenswrapper[4839]: E0321 05:05:13.454835 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:05:27 crc kubenswrapper[4839]: I0321 05:05:27.452641 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:05:27 crc kubenswrapper[4839]: E0321 05:05:27.453440 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:05:39 crc kubenswrapper[4839]: I0321 05:05:39.453362 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:05:39 crc kubenswrapper[4839]: E0321 05:05:39.455236 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:05:42 crc kubenswrapper[4839]: I0321 05:05:42.658810 4839 generic.go:334] "Generic (PLEG): container finished" podID="3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" containerID="cee45afd6f3370035efcc95b2e9996fd151275bd0576f734a67b97aa4b99e58e" exitCode=0 Mar 21 05:05:42 crc kubenswrapper[4839]: I0321 05:05:42.658875 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" event={"ID":"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7","Type":"ContainerDied","Data":"cee45afd6f3370035efcc95b2e9996fd151275bd0576f734a67b97aa4b99e58e"} Mar 21 05:05:42 crc kubenswrapper[4839]: E0321 05:05:42.982909 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.038384 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.141733 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-combined-ca-bundle\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.141780 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-0\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.141840 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-1\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.141902 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-extra-config-0\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.141967 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-2\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 
05:05:44.142042 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-ssh-key-openstack-edpm-ipam\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.142066 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-inventory\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.142097 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-0\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.142120 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-1\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.142162 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-3\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.142239 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jjdv\" (UniqueName: 
\"kubernetes.io/projected/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-kube-api-access-5jjdv\") pod \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\" (UID: \"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7\") " Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.164390 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-kube-api-access-5jjdv" (OuterVolumeSpecName: "kube-api-access-5jjdv") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "kube-api-access-5jjdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.166036 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.170251 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.170968 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.172266 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.176147 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.179943 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-inventory" (OuterVolumeSpecName: "inventory") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.189533 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.201816 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.204475 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244317 4839 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244348 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244360 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244371 4839 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" 
(UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244383 4839 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244393 4839 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244403 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jjdv\" (UniqueName: \"kubernetes.io/projected/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-kube-api-access-5jjdv\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244447 4839 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244458 4839 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.244470 4839 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.736374 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" (UID: "3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.752820 4839 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3f8728ca-30ff-41a9-8a48-e3bb7911bcc7-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.753435 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" event={"ID":"3f8728ca-30ff-41a9-8a48-e3bb7911bcc7","Type":"ContainerDied","Data":"8fc0a22e4f19211488bb9325fd259e084fbec2617916e0087741e1f7592e5580"} Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.753460 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fc0a22e4f19211488bb9325fd259e084fbec2617916e0087741e1f7592e5580" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.753510 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.792358 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq"] Mar 21 05:05:44 crc kubenswrapper[4839]: E0321 05:05:44.792790 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.792808 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 21 05:05:44 crc kubenswrapper[4839]: E0321 05:05:44.792828 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bbefcc3-042e-4587-b172-1a1de0f34dcf" containerName="oc" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.792839 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbefcc3-042e-4587-b172-1a1de0f34dcf" containerName="oc" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.793002 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.793017 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bbefcc3-042e-4587-b172-1a1de0f34dcf" containerName="oc" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.793636 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.802267 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.802291 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.802702 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.802349 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.802400 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pxvtf" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.812963 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq"] Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.860595 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.860656 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.860693 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5mvz\" (UniqueName: \"kubernetes.io/projected/4f49b501-bec5-4fe1-89d7-ff3c217ba580-kube-api-access-v5mvz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.860734 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.860797 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.860816 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.860833 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.963822 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.964028 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.964067 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.964105 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.964256 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.964339 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.964413 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5mvz\" (UniqueName: \"kubernetes.io/projected/4f49b501-bec5-4fe1-89d7-ff3c217ba580-kube-api-access-v5mvz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.970723 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.973002 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.977235 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.977268 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.977240 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: 
\"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.983105 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:44 crc kubenswrapper[4839]: I0321 05:05:44.983201 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5mvz\" (UniqueName: \"kubernetes.io/projected/4f49b501-bec5-4fe1-89d7-ff3c217ba580-kube-api-access-v5mvz\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:46 crc kubenswrapper[4839]: I0321 05:05:46.030827 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:05:46 crc kubenswrapper[4839]: I0321 05:05:46.704359 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:05:46 crc kubenswrapper[4839]: I0321 05:05:46.709481 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq"] Mar 21 05:05:47 crc kubenswrapper[4839]: I0321 05:05:47.112350 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" event={"ID":"4f49b501-bec5-4fe1-89d7-ff3c217ba580","Type":"ContainerStarted","Data":"d270c1962df5881492e69c0fae671ca811333a43b6b51f339984de8b79640216"} Mar 21 05:05:48 crc kubenswrapper[4839]: I0321 05:05:48.122463 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" event={"ID":"4f49b501-bec5-4fe1-89d7-ff3c217ba580","Type":"ContainerStarted","Data":"6da4c59622372824b75333d339cb0b5485cc2c5826926b2fb67ddc2f62e7dcd1"} Mar 21 05:05:48 crc kubenswrapper[4839]: I0321 05:05:48.159115 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" podStartSLOduration=3.403742169 podStartE2EDuration="4.159097347s" podCreationTimestamp="2026-03-21 05:05:44 +0000 UTC" firstStartedPulling="2026-03-21 05:05:46.7040993 +0000 UTC m=+2551.031885976" lastFinishedPulling="2026-03-21 05:05:47.459454478 +0000 UTC m=+2551.787241154" observedRunningTime="2026-03-21 05:05:48.1460952 +0000 UTC m=+2552.473881886" watchObservedRunningTime="2026-03-21 05:05:48.159097347 +0000 UTC m=+2552.486884023" Mar 21 05:05:51 crc kubenswrapper[4839]: I0321 05:05:51.453522 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:05:51 crc kubenswrapper[4839]: E0321 
05:05:51.454164 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.147919 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567826-x4cdd"] Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.149697 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.153885 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.154106 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.154320 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.157564 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-x4cdd"] Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.261057 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvjrb\" (UniqueName: \"kubernetes.io/projected/ba8ac9dd-e3e7-4e21-9286-731d926d9580-kube-api-access-wvjrb\") pod \"auto-csr-approver-29567826-x4cdd\" (UID: \"ba8ac9dd-e3e7-4e21-9286-731d926d9580\") " pod="openshift-infra/auto-csr-approver-29567826-x4cdd" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 
05:06:00.362703 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvjrb\" (UniqueName: \"kubernetes.io/projected/ba8ac9dd-e3e7-4e21-9286-731d926d9580-kube-api-access-wvjrb\") pod \"auto-csr-approver-29567826-x4cdd\" (UID: \"ba8ac9dd-e3e7-4e21-9286-731d926d9580\") " pod="openshift-infra/auto-csr-approver-29567826-x4cdd" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.389012 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvjrb\" (UniqueName: \"kubernetes.io/projected/ba8ac9dd-e3e7-4e21-9286-731d926d9580-kube-api-access-wvjrb\") pod \"auto-csr-approver-29567826-x4cdd\" (UID: \"ba8ac9dd-e3e7-4e21-9286-731d926d9580\") " pod="openshift-infra/auto-csr-approver-29567826-x4cdd" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.468863 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" Mar 21 05:06:00 crc kubenswrapper[4839]: I0321 05:06:00.912656 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-x4cdd"] Mar 21 05:06:01 crc kubenswrapper[4839]: I0321 05:06:01.230326 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" event={"ID":"ba8ac9dd-e3e7-4e21-9286-731d926d9580","Type":"ContainerStarted","Data":"74c42ba36d7e1a7761c8db53d665fd37163df1d3b96b661a9bbac805b13316d3"} Mar 21 05:06:02 crc kubenswrapper[4839]: I0321 05:06:02.242935 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" event={"ID":"ba8ac9dd-e3e7-4e21-9286-731d926d9580","Type":"ContainerStarted","Data":"1be67ef407b003d168ed5f91777a4df15466b61c19dea5b77ca6763eff6dadb2"} Mar 21 05:06:02 crc kubenswrapper[4839]: I0321 05:06:02.263830 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" 
podStartSLOduration=1.415089748 podStartE2EDuration="2.263808999s" podCreationTimestamp="2026-03-21 05:06:00 +0000 UTC" firstStartedPulling="2026-03-21 05:06:00.925015429 +0000 UTC m=+2565.252802105" lastFinishedPulling="2026-03-21 05:06:01.77373469 +0000 UTC m=+2566.101521356" observedRunningTime="2026-03-21 05:06:02.257970405 +0000 UTC m=+2566.585757081" watchObservedRunningTime="2026-03-21 05:06:02.263808999 +0000 UTC m=+2566.591595675" Mar 21 05:06:02 crc kubenswrapper[4839]: I0321 05:06:02.453685 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:06:02 crc kubenswrapper[4839]: E0321 05:06:02.453936 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:06:03 crc kubenswrapper[4839]: I0321 05:06:03.255238 4839 generic.go:334] "Generic (PLEG): container finished" podID="ba8ac9dd-e3e7-4e21-9286-731d926d9580" containerID="1be67ef407b003d168ed5f91777a4df15466b61c19dea5b77ca6763eff6dadb2" exitCode=0 Mar 21 05:06:03 crc kubenswrapper[4839]: I0321 05:06:03.255560 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" event={"ID":"ba8ac9dd-e3e7-4e21-9286-731d926d9580","Type":"ContainerDied","Data":"1be67ef407b003d168ed5f91777a4df15466b61c19dea5b77ca6763eff6dadb2"} Mar 21 05:06:04 crc kubenswrapper[4839]: I0321 05:06:04.633269 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" Mar 21 05:06:04 crc kubenswrapper[4839]: I0321 05:06:04.748430 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvjrb\" (UniqueName: \"kubernetes.io/projected/ba8ac9dd-e3e7-4e21-9286-731d926d9580-kube-api-access-wvjrb\") pod \"ba8ac9dd-e3e7-4e21-9286-731d926d9580\" (UID: \"ba8ac9dd-e3e7-4e21-9286-731d926d9580\") " Mar 21 05:06:04 crc kubenswrapper[4839]: I0321 05:06:04.760206 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba8ac9dd-e3e7-4e21-9286-731d926d9580-kube-api-access-wvjrb" (OuterVolumeSpecName: "kube-api-access-wvjrb") pod "ba8ac9dd-e3e7-4e21-9286-731d926d9580" (UID: "ba8ac9dd-e3e7-4e21-9286-731d926d9580"). InnerVolumeSpecName "kube-api-access-wvjrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:06:04 crc kubenswrapper[4839]: I0321 05:06:04.851134 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvjrb\" (UniqueName: \"kubernetes.io/projected/ba8ac9dd-e3e7-4e21-9286-731d926d9580-kube-api-access-wvjrb\") on node \"crc\" DevicePath \"\"" Mar 21 05:06:05 crc kubenswrapper[4839]: I0321 05:06:05.271760 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" event={"ID":"ba8ac9dd-e3e7-4e21-9286-731d926d9580","Type":"ContainerDied","Data":"74c42ba36d7e1a7761c8db53d665fd37163df1d3b96b661a9bbac805b13316d3"} Mar 21 05:06:05 crc kubenswrapper[4839]: I0321 05:06:05.271809 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567826-x4cdd" Mar 21 05:06:05 crc kubenswrapper[4839]: I0321 05:06:05.271813 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74c42ba36d7e1a7761c8db53d665fd37163df1d3b96b661a9bbac805b13316d3" Mar 21 05:06:05 crc kubenswrapper[4839]: I0321 05:06:05.340293 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-drjbh"] Mar 21 05:06:05 crc kubenswrapper[4839]: I0321 05:06:05.350510 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567820-drjbh"] Mar 21 05:06:06 crc kubenswrapper[4839]: I0321 05:06:06.463671 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a082320-155d-4eb3-9779-9c6bb4db2b77" path="/var/lib/kubelet/pods/2a082320-155d-4eb3-9779-9c6bb4db2b77/volumes" Mar 21 05:06:14 crc kubenswrapper[4839]: I0321 05:06:14.812473 4839 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod3f8728ca-30ff-41a9-8a48-e3bb7911bcc7"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod3f8728ca-30ff-41a9-8a48-e3bb7911bcc7] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3f8728ca_30ff_41a9_8a48_e3bb7911bcc7.slice" Mar 21 05:06:14 crc kubenswrapper[4839]: E0321 05:06:14.813084 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod3f8728ca-30ff-41a9-8a48-e3bb7911bcc7] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod3f8728ca-30ff-41a9-8a48-e3bb7911bcc7] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3f8728ca_30ff_41a9_8a48_e3bb7911bcc7.slice" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" podUID="3f8728ca-30ff-41a9-8a48-e3bb7911bcc7" Mar 21 05:06:15 crc kubenswrapper[4839]: I0321 05:06:15.350195 4839 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-hf42f" Mar 21 05:06:17 crc kubenswrapper[4839]: I0321 05:06:17.452783 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:06:17 crc kubenswrapper[4839]: E0321 05:06:17.453367 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:06:28 crc kubenswrapper[4839]: I0321 05:06:28.453413 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:06:28 crc kubenswrapper[4839]: E0321 05:06:28.456249 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:06:43 crc kubenswrapper[4839]: I0321 05:06:43.453510 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:06:43 crc kubenswrapper[4839]: E0321 05:06:43.454514 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:06:55 crc kubenswrapper[4839]: I0321 05:06:55.968836 4839 scope.go:117] "RemoveContainer" containerID="282bb60b5cd122e380e9afc3be1cd2592f307d95be7c684d73af1ea1a65bb700" Mar 21 05:06:58 crc kubenswrapper[4839]: I0321 05:06:58.452744 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:06:58 crc kubenswrapper[4839]: E0321 05:06:58.453544 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:07:10 crc kubenswrapper[4839]: I0321 05:07:10.452251 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:07:10 crc kubenswrapper[4839]: E0321 05:07:10.454395 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:07:23 crc kubenswrapper[4839]: I0321 05:07:23.453228 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:07:23 crc kubenswrapper[4839]: E0321 05:07:23.454009 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:07:34 crc kubenswrapper[4839]: I0321 05:07:34.452990 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:07:34 crc kubenswrapper[4839]: E0321 05:07:34.453686 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:07:47 crc kubenswrapper[4839]: I0321 05:07:47.454113 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:07:47 crc kubenswrapper[4839]: E0321 05:07:47.456094 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:07:59 crc kubenswrapper[4839]: I0321 05:07:59.453040 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:07:59 crc kubenswrapper[4839]: E0321 05:07:59.453834 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.150079 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567828-qnptz"] Mar 21 05:08:00 crc kubenswrapper[4839]: E0321 05:08:00.150610 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8ac9dd-e3e7-4e21-9286-731d926d9580" containerName="oc" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.150627 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8ac9dd-e3e7-4e21-9286-731d926d9580" containerName="oc" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.150891 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8ac9dd-e3e7-4e21-9286-731d926d9580" containerName="oc" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.151669 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-qnptz" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.153982 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.153992 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.156480 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.160516 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-qnptz"] Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.219462 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwlvj\" (UniqueName: \"kubernetes.io/projected/e550e6e6-fc33-4703-b8db-6cd8169ebc7f-kube-api-access-hwlvj\") pod \"auto-csr-approver-29567828-qnptz\" (UID: \"e550e6e6-fc33-4703-b8db-6cd8169ebc7f\") " pod="openshift-infra/auto-csr-approver-29567828-qnptz" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.320870 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwlvj\" (UniqueName: \"kubernetes.io/projected/e550e6e6-fc33-4703-b8db-6cd8169ebc7f-kube-api-access-hwlvj\") pod \"auto-csr-approver-29567828-qnptz\" (UID: \"e550e6e6-fc33-4703-b8db-6cd8169ebc7f\") " pod="openshift-infra/auto-csr-approver-29567828-qnptz" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.339887 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwlvj\" (UniqueName: \"kubernetes.io/projected/e550e6e6-fc33-4703-b8db-6cd8169ebc7f-kube-api-access-hwlvj\") pod \"auto-csr-approver-29567828-qnptz\" (UID: \"e550e6e6-fc33-4703-b8db-6cd8169ebc7f\") " 
pod="openshift-infra/auto-csr-approver-29567828-qnptz" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.473436 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-qnptz" Mar 21 05:08:00 crc kubenswrapper[4839]: I0321 05:08:00.917927 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-qnptz"] Mar 21 05:08:01 crc kubenswrapper[4839]: I0321 05:08:01.426597 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567828-qnptz" event={"ID":"e550e6e6-fc33-4703-b8db-6cd8169ebc7f","Type":"ContainerStarted","Data":"6fff64e5fc3a56c4210f99621355bab4012625eb139003829e1ddce0fe2e1941"} Mar 21 05:08:03 crc kubenswrapper[4839]: I0321 05:08:03.453694 4839 generic.go:334] "Generic (PLEG): container finished" podID="e550e6e6-fc33-4703-b8db-6cd8169ebc7f" containerID="4eee1ce2fe1133d7d9ea3d85cfff635c2acc34be75ad8890d23c93b28ff12298" exitCode=0 Mar 21 05:08:03 crc kubenswrapper[4839]: I0321 05:08:03.453773 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567828-qnptz" event={"ID":"e550e6e6-fc33-4703-b8db-6cd8169ebc7f","Type":"ContainerDied","Data":"4eee1ce2fe1133d7d9ea3d85cfff635c2acc34be75ad8890d23c93b28ff12298"} Mar 21 05:08:04 crc kubenswrapper[4839]: I0321 05:08:04.793622 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-qnptz" Mar 21 05:08:04 crc kubenswrapper[4839]: I0321 05:08:04.938558 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwlvj\" (UniqueName: \"kubernetes.io/projected/e550e6e6-fc33-4703-b8db-6cd8169ebc7f-kube-api-access-hwlvj\") pod \"e550e6e6-fc33-4703-b8db-6cd8169ebc7f\" (UID: \"e550e6e6-fc33-4703-b8db-6cd8169ebc7f\") " Mar 21 05:08:04 crc kubenswrapper[4839]: I0321 05:08:04.944592 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e550e6e6-fc33-4703-b8db-6cd8169ebc7f-kube-api-access-hwlvj" (OuterVolumeSpecName: "kube-api-access-hwlvj") pod "e550e6e6-fc33-4703-b8db-6cd8169ebc7f" (UID: "e550e6e6-fc33-4703-b8db-6cd8169ebc7f"). InnerVolumeSpecName "kube-api-access-hwlvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:05 crc kubenswrapper[4839]: I0321 05:08:05.041425 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwlvj\" (UniqueName: \"kubernetes.io/projected/e550e6e6-fc33-4703-b8db-6cd8169ebc7f-kube-api-access-hwlvj\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:05 crc kubenswrapper[4839]: I0321 05:08:05.475454 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567828-qnptz" event={"ID":"e550e6e6-fc33-4703-b8db-6cd8169ebc7f","Type":"ContainerDied","Data":"6fff64e5fc3a56c4210f99621355bab4012625eb139003829e1ddce0fe2e1941"} Mar 21 05:08:05 crc kubenswrapper[4839]: I0321 05:08:05.475492 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fff64e5fc3a56c4210f99621355bab4012625eb139003829e1ddce0fe2e1941" Mar 21 05:08:05 crc kubenswrapper[4839]: I0321 05:08:05.475496 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567828-qnptz" Mar 21 05:08:05 crc kubenswrapper[4839]: I0321 05:08:05.865215 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-666xq"] Mar 21 05:08:05 crc kubenswrapper[4839]: I0321 05:08:05.873426 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567822-666xq"] Mar 21 05:08:06 crc kubenswrapper[4839]: I0321 05:08:06.462654 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5246ade9-02c7-4a6c-b903-f556b6405d03" path="/var/lib/kubelet/pods/5246ade9-02c7-4a6c-b903-f556b6405d03/volumes" Mar 21 05:08:07 crc kubenswrapper[4839]: I0321 05:08:07.494140 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" event={"ID":"4f49b501-bec5-4fe1-89d7-ff3c217ba580","Type":"ContainerDied","Data":"6da4c59622372824b75333d339cb0b5485cc2c5826926b2fb67ddc2f62e7dcd1"} Mar 21 05:08:07 crc kubenswrapper[4839]: I0321 05:08:07.494189 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f49b501-bec5-4fe1-89d7-ff3c217ba580" containerID="6da4c59622372824b75333d339cb0b5485cc2c5826926b2fb67ddc2f62e7dcd1" exitCode=0 Mar 21 05:08:08 crc kubenswrapper[4839]: I0321 05:08:08.946699 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.119890 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-inventory\") pod \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.119990 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ssh-key-openstack-edpm-ipam\") pod \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.120054 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5mvz\" (UniqueName: \"kubernetes.io/projected/4f49b501-bec5-4fe1-89d7-ff3c217ba580-kube-api-access-v5mvz\") pod \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.120112 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-1\") pod \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.120140 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-2\") pod \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 
05:08:09.120217 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-0\") pod \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.120267 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-telemetry-combined-ca-bundle\") pod \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\" (UID: \"4f49b501-bec5-4fe1-89d7-ff3c217ba580\") " Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.129817 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "4f49b501-bec5-4fe1-89d7-ff3c217ba580" (UID: "4f49b501-bec5-4fe1-89d7-ff3c217ba580"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.131901 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f49b501-bec5-4fe1-89d7-ff3c217ba580-kube-api-access-v5mvz" (OuterVolumeSpecName: "kube-api-access-v5mvz") pod "4f49b501-bec5-4fe1-89d7-ff3c217ba580" (UID: "4f49b501-bec5-4fe1-89d7-ff3c217ba580"). InnerVolumeSpecName "kube-api-access-v5mvz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.148120 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "4f49b501-bec5-4fe1-89d7-ff3c217ba580" (UID: "4f49b501-bec5-4fe1-89d7-ff3c217ba580"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.149835 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "4f49b501-bec5-4fe1-89d7-ff3c217ba580" (UID: "4f49b501-bec5-4fe1-89d7-ff3c217ba580"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.150855 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "4f49b501-bec5-4fe1-89d7-ff3c217ba580" (UID: "4f49b501-bec5-4fe1-89d7-ff3c217ba580"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.152770 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4f49b501-bec5-4fe1-89d7-ff3c217ba580" (UID: "4f49b501-bec5-4fe1-89d7-ff3c217ba580"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.152813 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-inventory" (OuterVolumeSpecName: "inventory") pod "4f49b501-bec5-4fe1-89d7-ff3c217ba580" (UID: "4f49b501-bec5-4fe1-89d7-ff3c217ba580"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.222093 4839 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.222129 4839 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-inventory\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.222861 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.222886 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5mvz\" (UniqueName: \"kubernetes.io/projected/4f49b501-bec5-4fe1-89d7-ff3c217ba580-kube-api-access-v5mvz\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.222898 4839 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.222933 4839 reconciler_common.go:293] "Volume 
detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.222946 4839 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4f49b501-bec5-4fe1-89d7-ff3c217ba580-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.512024 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" event={"ID":"4f49b501-bec5-4fe1-89d7-ff3c217ba580","Type":"ContainerDied","Data":"d270c1962df5881492e69c0fae671ca811333a43b6b51f339984de8b79640216"} Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.512109 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d270c1962df5881492e69c0fae671ca811333a43b6b51f339984de8b79640216" Mar 21 05:08:09 crc kubenswrapper[4839]: I0321 05:08:09.512119 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq" Mar 21 05:08:14 crc kubenswrapper[4839]: I0321 05:08:14.453175 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:08:14 crc kubenswrapper[4839]: E0321 05:08:14.454059 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:08:28 crc kubenswrapper[4839]: I0321 05:08:28.453439 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:08:28 crc kubenswrapper[4839]: E0321 05:08:28.454315 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:08:42 crc kubenswrapper[4839]: I0321 05:08:42.453028 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:08:42 crc kubenswrapper[4839]: E0321 05:08:42.453946 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.523440 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 21 05:08:54 crc kubenswrapper[4839]: E0321 05:08:54.524457 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f49b501-bec5-4fe1-89d7-ff3c217ba580" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.524476 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f49b501-bec5-4fe1-89d7-ff3c217ba580" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 21 05:08:54 crc kubenswrapper[4839]: E0321 05:08:54.524493 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e550e6e6-fc33-4703-b8db-6cd8169ebc7f" containerName="oc" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.524500 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e550e6e6-fc33-4703-b8db-6cd8169ebc7f" containerName="oc" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.524739 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e550e6e6-fc33-4703-b8db-6cd8169ebc7f" containerName="oc" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.524756 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f49b501-bec5-4fe1-89d7-ff3c217ba580" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.525452 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.527764 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.527891 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.528947 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6v5zd" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.529297 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.537307 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.552255 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.552317 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.552364 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-config-data\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.654105 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.654240 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.654268 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-config-data\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.654303 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.654901 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.654982 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x84s5\" (UniqueName: \"kubernetes.io/projected/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-kube-api-access-x84s5\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.655067 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.655123 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.655449 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.657510 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config\") pod 
\"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.657647 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-config-data\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.665435 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.757408 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x84s5\" (UniqueName: \"kubernetes.io/projected/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-kube-api-access-x84s5\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.757937 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.758084 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: 
I0321 05:08:54.758151 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.758269 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.758284 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.758698 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.758853 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc 
kubenswrapper[4839]: I0321 05:08:54.758934 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.763173 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.764508 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.782311 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x84s5\" (UniqueName: \"kubernetes.io/projected/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-kube-api-access-x84s5\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.786208 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") " pod="openstack/tempest-tests-tempest" Mar 21 05:08:54 crc kubenswrapper[4839]: I0321 05:08:54.862418 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 21 05:08:55 crc kubenswrapper[4839]: I0321 05:08:55.337724 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 21 05:08:55 crc kubenswrapper[4839]: I0321 05:08:55.927890 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3","Type":"ContainerStarted","Data":"de55efea3ef0459f6ae11516d74916a8089be737bb24fdba8c8fcffa3719ebe6"} Mar 21 05:08:56 crc kubenswrapper[4839]: I0321 05:08:56.077989 4839 scope.go:117] "RemoveContainer" containerID="4b22a92b0ab8fcff90ca92aa57b5aa47ae8ec5ba62184c302116eb1d72e13a3d" Mar 21 05:08:56 crc kubenswrapper[4839]: I0321 05:08:56.125956 4839 scope.go:117] "RemoveContainer" containerID="1efc52951d43a245177e4dff7cfd1e3426fa930b28133acb294aea6743e70139" Mar 21 05:08:56 crc kubenswrapper[4839]: I0321 05:08:56.166490 4839 scope.go:117] "RemoveContainer" containerID="292e8e6107a41a041900da65d2d65595f3753b426bbb8c2a06a65273c04fd6b1" Mar 21 05:08:56 crc kubenswrapper[4839]: I0321 05:08:56.251436 4839 scope.go:117] "RemoveContainer" containerID="b9de2139979cd92ed9f8ddab30d1a42bd7d67a863ada145cf6cbe71703537956" Mar 21 05:08:57 crc kubenswrapper[4839]: I0321 05:08:57.453217 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:08:57 crc kubenswrapper[4839]: E0321 05:08:57.453785 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:09:08 crc kubenswrapper[4839]: I0321 05:09:08.452704 4839 scope.go:117] 
"RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:09:08 crc kubenswrapper[4839]: E0321 05:09:08.453718 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:09:22 crc kubenswrapper[4839]: I0321 05:09:22.453453 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:09:22 crc kubenswrapper[4839]: E0321 05:09:22.454648 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:09:23 crc kubenswrapper[4839]: E0321 05:09:23.562861 4839 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 21 05:09:23 crc kubenswrapper[4839]: E0321 05:09:23.563379 4839 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x84s5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 21 05:09:23 crc kubenswrapper[4839]: E0321 05:09:23.564542 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" Mar 21 05:09:24 crc kubenswrapper[4839]: E0321 05:09:24.177224 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" Mar 21 05:09:36 crc 
kubenswrapper[4839]: I0321 05:09:36.877081 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 21 05:09:37 crc kubenswrapper[4839]: I0321 05:09:37.453072 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:09:37 crc kubenswrapper[4839]: E0321 05:09:37.453707 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.017099 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qslfs"] Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.021127 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.028282 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qslfs"] Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.123356 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmx52\" (UniqueName: \"kubernetes.io/projected/9724e408-6086-45cd-961d-5d5504f15791-kube-api-access-mmx52\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.123448 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-catalog-content\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.123503 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-utilities\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.225141 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-catalog-content\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.226261 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-catalog-content\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.226441 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-utilities\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.226693 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmx52\" (UniqueName: \"kubernetes.io/projected/9724e408-6086-45cd-961d-5d5504f15791-kube-api-access-mmx52\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.226709 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-utilities\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.245055 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmx52\" (UniqueName: \"kubernetes.io/projected/9724e408-6086-45cd-961d-5d5504f15791-kube-api-access-mmx52\") pod \"redhat-operators-qslfs\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.331697 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3","Type":"ContainerStarted","Data":"0176599a8a2d6c5f1b857f924207691c2463c8d61ed3270470c8fc3d29535c3b"} Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.340696 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.367073 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.8323489139999998 podStartE2EDuration="45.367055123s" podCreationTimestamp="2026-03-21 05:08:53 +0000 UTC" firstStartedPulling="2026-03-21 05:08:55.339294053 +0000 UTC m=+2739.667080729" lastFinishedPulling="2026-03-21 05:09:36.874000262 +0000 UTC m=+2781.201786938" observedRunningTime="2026-03-21 05:09:38.356829495 +0000 UTC m=+2782.684616171" watchObservedRunningTime="2026-03-21 05:09:38.367055123 +0000 UTC m=+2782.694841799" Mar 21 05:09:38 crc kubenswrapper[4839]: W0321 05:09:38.836165 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9724e408_6086_45cd_961d_5d5504f15791.slice/crio-3ee0994c45cc96d08771860e048f51ded7dac2e5b506ab578b474e4ea7342e27 WatchSource:0}: Error finding container 3ee0994c45cc96d08771860e048f51ded7dac2e5b506ab578b474e4ea7342e27: Status 404 returned error can't find the container with id 3ee0994c45cc96d08771860e048f51ded7dac2e5b506ab578b474e4ea7342e27 Mar 21 05:09:38 crc kubenswrapper[4839]: I0321 05:09:38.847186 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qslfs"] Mar 21 05:09:39 crc kubenswrapper[4839]: I0321 05:09:39.344037 4839 generic.go:334] "Generic (PLEG): container finished" podID="9724e408-6086-45cd-961d-5d5504f15791" containerID="f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237" exitCode=0 Mar 21 05:09:39 crc kubenswrapper[4839]: I0321 05:09:39.344266 
4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qslfs" event={"ID":"9724e408-6086-45cd-961d-5d5504f15791","Type":"ContainerDied","Data":"f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237"} Mar 21 05:09:39 crc kubenswrapper[4839]: I0321 05:09:39.344358 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qslfs" event={"ID":"9724e408-6086-45cd-961d-5d5504f15791","Type":"ContainerStarted","Data":"3ee0994c45cc96d08771860e048f51ded7dac2e5b506ab578b474e4ea7342e27"} Mar 21 05:09:40 crc kubenswrapper[4839]: I0321 05:09:40.357171 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qslfs" event={"ID":"9724e408-6086-45cd-961d-5d5504f15791","Type":"ContainerStarted","Data":"f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8"} Mar 21 05:09:42 crc kubenswrapper[4839]: I0321 05:09:42.381973 4839 generic.go:334] "Generic (PLEG): container finished" podID="9724e408-6086-45cd-961d-5d5504f15791" containerID="f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8" exitCode=0 Mar 21 05:09:42 crc kubenswrapper[4839]: I0321 05:09:42.382046 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qslfs" event={"ID":"9724e408-6086-45cd-961d-5d5504f15791","Type":"ContainerDied","Data":"f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8"} Mar 21 05:09:43 crc kubenswrapper[4839]: I0321 05:09:43.394198 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qslfs" event={"ID":"9724e408-6086-45cd-961d-5d5504f15791","Type":"ContainerStarted","Data":"a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4"} Mar 21 05:09:43 crc kubenswrapper[4839]: I0321 05:09:43.425301 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qslfs" 
podStartSLOduration=2.961230426 podStartE2EDuration="6.425280885s" podCreationTimestamp="2026-03-21 05:09:37 +0000 UTC" firstStartedPulling="2026-03-21 05:09:39.345998298 +0000 UTC m=+2783.673784974" lastFinishedPulling="2026-03-21 05:09:42.810048757 +0000 UTC m=+2787.137835433" observedRunningTime="2026-03-21 05:09:43.421466888 +0000 UTC m=+2787.749253564" watchObservedRunningTime="2026-03-21 05:09:43.425280885 +0000 UTC m=+2787.753067561" Mar 21 05:09:48 crc kubenswrapper[4839]: I0321 05:09:48.341015 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:48 crc kubenswrapper[4839]: I0321 05:09:48.341478 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:09:49 crc kubenswrapper[4839]: I0321 05:09:49.399893 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qslfs" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="registry-server" probeResult="failure" output=< Mar 21 05:09:49 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 05:09:49 crc kubenswrapper[4839]: > Mar 21 05:09:51 crc kubenswrapper[4839]: I0321 05:09:51.453204 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:09:51 crc kubenswrapper[4839]: E0321 05:09:51.453781 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:09:59 crc kubenswrapper[4839]: I0321 05:09:59.389726 4839 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qslfs" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="registry-server" probeResult="failure" output=< Mar 21 05:09:59 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 05:09:59 crc kubenswrapper[4839]: > Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.150826 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567830-8w2nt"] Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.152708 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-8w2nt" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.154846 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.157972 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.158269 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.159979 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-8w2nt"] Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.208387 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl7qr\" (UniqueName: \"kubernetes.io/projected/aec2b9da-f24b-47bb-95cc-2903624e2eb1-kube-api-access-kl7qr\") pod \"auto-csr-approver-29567830-8w2nt\" (UID: \"aec2b9da-f24b-47bb-95cc-2903624e2eb1\") " pod="openshift-infra/auto-csr-approver-29567830-8w2nt" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.310256 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kl7qr\" (UniqueName: \"kubernetes.io/projected/aec2b9da-f24b-47bb-95cc-2903624e2eb1-kube-api-access-kl7qr\") pod \"auto-csr-approver-29567830-8w2nt\" (UID: \"aec2b9da-f24b-47bb-95cc-2903624e2eb1\") " pod="openshift-infra/auto-csr-approver-29567830-8w2nt" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.330467 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl7qr\" (UniqueName: \"kubernetes.io/projected/aec2b9da-f24b-47bb-95cc-2903624e2eb1-kube-api-access-kl7qr\") pod \"auto-csr-approver-29567830-8w2nt\" (UID: \"aec2b9da-f24b-47bb-95cc-2903624e2eb1\") " pod="openshift-infra/auto-csr-approver-29567830-8w2nt" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.475543 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-8w2nt" Mar 21 05:10:00 crc kubenswrapper[4839]: I0321 05:10:00.998146 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-8w2nt"] Mar 21 05:10:01 crc kubenswrapper[4839]: I0321 05:10:01.731383 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567830-8w2nt" event={"ID":"aec2b9da-f24b-47bb-95cc-2903624e2eb1","Type":"ContainerStarted","Data":"6995a7d62b6effa45d3302d6b5288dcf4083f2bb7e7241635732d04a65452302"} Mar 21 05:10:03 crc kubenswrapper[4839]: I0321 05:10:03.453142 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:10:03 crc kubenswrapper[4839]: I0321 05:10:03.747244 4839 generic.go:334] "Generic (PLEG): container finished" podID="aec2b9da-f24b-47bb-95cc-2903624e2eb1" containerID="3edbd65acb10c2e93822fc30bbb912980fc1145d300030a3f42acd4b77e841c2" exitCode=0 Mar 21 05:10:03 crc kubenswrapper[4839]: I0321 05:10:03.747328 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567830-8w2nt" 
event={"ID":"aec2b9da-f24b-47bb-95cc-2903624e2eb1","Type":"ContainerDied","Data":"3edbd65acb10c2e93822fc30bbb912980fc1145d300030a3f42acd4b77e841c2"} Mar 21 05:10:03 crc kubenswrapper[4839]: I0321 05:10:03.750316 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"b3b7e174e088400e045d778d09a66442ae3aba7106eba8585b97c2b9d3b1ab21"} Mar 21 05:10:05 crc kubenswrapper[4839]: I0321 05:10:05.208032 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-8w2nt" Mar 21 05:10:05 crc kubenswrapper[4839]: I0321 05:10:05.323277 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl7qr\" (UniqueName: \"kubernetes.io/projected/aec2b9da-f24b-47bb-95cc-2903624e2eb1-kube-api-access-kl7qr\") pod \"aec2b9da-f24b-47bb-95cc-2903624e2eb1\" (UID: \"aec2b9da-f24b-47bb-95cc-2903624e2eb1\") " Mar 21 05:10:05 crc kubenswrapper[4839]: I0321 05:10:05.336779 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec2b9da-f24b-47bb-95cc-2903624e2eb1-kube-api-access-kl7qr" (OuterVolumeSpecName: "kube-api-access-kl7qr") pod "aec2b9da-f24b-47bb-95cc-2903624e2eb1" (UID: "aec2b9da-f24b-47bb-95cc-2903624e2eb1"). InnerVolumeSpecName "kube-api-access-kl7qr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:05 crc kubenswrapper[4839]: I0321 05:10:05.425458 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl7qr\" (UniqueName: \"kubernetes.io/projected/aec2b9da-f24b-47bb-95cc-2903624e2eb1-kube-api-access-kl7qr\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:05 crc kubenswrapper[4839]: I0321 05:10:05.775894 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567830-8w2nt" event={"ID":"aec2b9da-f24b-47bb-95cc-2903624e2eb1","Type":"ContainerDied","Data":"6995a7d62b6effa45d3302d6b5288dcf4083f2bb7e7241635732d04a65452302"} Mar 21 05:10:05 crc kubenswrapper[4839]: I0321 05:10:05.779438 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6995a7d62b6effa45d3302d6b5288dcf4083f2bb7e7241635732d04a65452302" Mar 21 05:10:05 crc kubenswrapper[4839]: I0321 05:10:05.778987 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567830-8w2nt" Mar 21 05:10:06 crc kubenswrapper[4839]: I0321 05:10:06.282392 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-vx55r"] Mar 21 05:10:06 crc kubenswrapper[4839]: I0321 05:10:06.290296 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567824-vx55r"] Mar 21 05:10:06 crc kubenswrapper[4839]: I0321 05:10:06.465484 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bbefcc3-042e-4587-b172-1a1de0f34dcf" path="/var/lib/kubelet/pods/0bbefcc3-042e-4587-b172-1a1de0f34dcf/volumes" Mar 21 05:10:09 crc kubenswrapper[4839]: I0321 05:10:09.397810 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qslfs" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="registry-server" probeResult="failure" output=< Mar 21 05:10:09 crc kubenswrapper[4839]: 
timeout: failed to connect service ":50051" within 1s Mar 21 05:10:09 crc kubenswrapper[4839]: > Mar 21 05:10:18 crc kubenswrapper[4839]: I0321 05:10:18.382310 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:10:18 crc kubenswrapper[4839]: I0321 05:10:18.429143 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:10:18 crc kubenswrapper[4839]: I0321 05:10:18.616319 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qslfs"] Mar 21 05:10:19 crc kubenswrapper[4839]: I0321 05:10:19.915475 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qslfs" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="registry-server" containerID="cri-o://a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4" gracePeriod=2 Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.905627 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.934462 4839 generic.go:334] "Generic (PLEG): container finished" podID="9724e408-6086-45cd-961d-5d5504f15791" containerID="a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4" exitCode=0 Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.934505 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qslfs" event={"ID":"9724e408-6086-45cd-961d-5d5504f15791","Type":"ContainerDied","Data":"a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4"} Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.934546 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qslfs" event={"ID":"9724e408-6086-45cd-961d-5d5504f15791","Type":"ContainerDied","Data":"3ee0994c45cc96d08771860e048f51ded7dac2e5b506ab578b474e4ea7342e27"} Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.934584 4839 scope.go:117] "RemoveContainer" containerID="a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.934670 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qslfs" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.956069 4839 scope.go:117] "RemoveContainer" containerID="f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:20.980723 4839 scope.go:117] "RemoveContainer" containerID="f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.018286 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-catalog-content\") pod \"9724e408-6086-45cd-961d-5d5504f15791\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.018384 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmx52\" (UniqueName: \"kubernetes.io/projected/9724e408-6086-45cd-961d-5d5504f15791-kube-api-access-mmx52\") pod \"9724e408-6086-45cd-961d-5d5504f15791\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.018435 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-utilities\") pod \"9724e408-6086-45cd-961d-5d5504f15791\" (UID: \"9724e408-6086-45cd-961d-5d5504f15791\") " Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.019494 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-utilities" (OuterVolumeSpecName: "utilities") pod "9724e408-6086-45cd-961d-5d5504f15791" (UID: "9724e408-6086-45cd-961d-5d5504f15791"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.022414 4839 scope.go:117] "RemoveContainer" containerID="a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4" Mar 21 05:10:21 crc kubenswrapper[4839]: E0321 05:10:21.023616 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4\": container with ID starting with a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4 not found: ID does not exist" containerID="a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.023681 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4"} err="failed to get container status \"a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4\": rpc error: code = NotFound desc = could not find container \"a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4\": container with ID starting with a93bb86a5e91c867993ccba2e62a0a47cc3e5af5df9b68eef0f88d6d164f97d4 not found: ID does not exist" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.023716 4839 scope.go:117] "RemoveContainer" containerID="f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8" Mar 21 05:10:21 crc kubenswrapper[4839]: E0321 05:10:21.024128 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8\": container with ID starting with f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8 not found: ID does not exist" containerID="f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.024155 
4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8"} err="failed to get container status \"f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8\": rpc error: code = NotFound desc = could not find container \"f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8\": container with ID starting with f4222b0bdc17d02a76cee6d0d46fc25a35d55bbc936412983c70c150951d1ca8 not found: ID does not exist" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.024175 4839 scope.go:117] "RemoveContainer" containerID="f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237" Mar 21 05:10:21 crc kubenswrapper[4839]: E0321 05:10:21.024595 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237\": container with ID starting with f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237 not found: ID does not exist" containerID="f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.024639 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237"} err="failed to get container status \"f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237\": rpc error: code = NotFound desc = could not find container \"f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237\": container with ID starting with f416b63465291b1b0afd3a134a0a816800ac0b064ea726d9921fdf7a33ec3237 not found: ID does not exist" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.027892 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9724e408-6086-45cd-961d-5d5504f15791-kube-api-access-mmx52" 
(OuterVolumeSpecName: "kube-api-access-mmx52") pod "9724e408-6086-45cd-961d-5d5504f15791" (UID: "9724e408-6086-45cd-961d-5d5504f15791"). InnerVolumeSpecName "kube-api-access-mmx52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.120500 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmx52\" (UniqueName: \"kubernetes.io/projected/9724e408-6086-45cd-961d-5d5504f15791-kube-api-access-mmx52\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.120525 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.154527 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9724e408-6086-45cd-961d-5d5504f15791" (UID: "9724e408-6086-45cd-961d-5d5504f15791"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.222354 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9724e408-6086-45cd-961d-5d5504f15791-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.276751 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qslfs"] Mar 21 05:10:21 crc kubenswrapper[4839]: I0321 05:10:21.285721 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qslfs"] Mar 21 05:10:22 crc kubenswrapper[4839]: I0321 05:10:22.463610 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9724e408-6086-45cd-961d-5d5504f15791" path="/var/lib/kubelet/pods/9724e408-6086-45cd-961d-5d5504f15791/volumes" Mar 21 05:10:56 crc kubenswrapper[4839]: I0321 05:10:56.350098 4839 scope.go:117] "RemoveContainer" containerID="ca1bd38f0e84cbc6abf00446444e014421818ecf4311848a17094cd139ec8ed6" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.441788 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fh65n"] Mar 21 05:11:18 crc kubenswrapper[4839]: E0321 05:11:18.443196 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="registry-server" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.443214 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="registry-server" Mar 21 05:11:18 crc kubenswrapper[4839]: E0321 05:11:18.443241 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="extract-utilities" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.443250 4839 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="extract-utilities" Mar 21 05:11:18 crc kubenswrapper[4839]: E0321 05:11:18.443258 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="extract-content" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.443267 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="extract-content" Mar 21 05:11:18 crc kubenswrapper[4839]: E0321 05:11:18.443289 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aec2b9da-f24b-47bb-95cc-2903624e2eb1" containerName="oc" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.443297 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec2b9da-f24b-47bb-95cc-2903624e2eb1" containerName="oc" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.443491 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="9724e408-6086-45cd-961d-5d5504f15791" containerName="registry-server" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.443513 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="aec2b9da-f24b-47bb-95cc-2903624e2eb1" containerName="oc" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.444967 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.466976 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fh65n"] Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.543945 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-catalog-content\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.544088 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frdgb\" (UniqueName: \"kubernetes.io/projected/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-kube-api-access-frdgb\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.544148 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-utilities\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.645561 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-catalog-content\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.645700 4839 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-frdgb\" (UniqueName: \"kubernetes.io/projected/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-kube-api-access-frdgb\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.645758 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-utilities\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.646107 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-catalog-content\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.646140 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-utilities\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.670877 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frdgb\" (UniqueName: \"kubernetes.io/projected/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-kube-api-access-frdgb\") pod \"community-operators-fh65n\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:18 crc kubenswrapper[4839]: I0321 05:11:18.766916 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:19 crc kubenswrapper[4839]: I0321 05:11:19.284341 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fh65n"] Mar 21 05:11:19 crc kubenswrapper[4839]: I0321 05:11:19.466328 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh65n" event={"ID":"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e","Type":"ContainerStarted","Data":"cde0425763174457626a3683ed0e3ac17018ee383a5638dbacee306c171373be"} Mar 21 05:11:20 crc kubenswrapper[4839]: I0321 05:11:20.479664 4839 generic.go:334] "Generic (PLEG): container finished" podID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerID="68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883" exitCode=0 Mar 21 05:11:20 crc kubenswrapper[4839]: I0321 05:11:20.479913 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh65n" event={"ID":"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e","Type":"ContainerDied","Data":"68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883"} Mar 21 05:11:20 crc kubenswrapper[4839]: I0321 05:11:20.484522 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:11:21 crc kubenswrapper[4839]: I0321 05:11:21.490221 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh65n" event={"ID":"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e","Type":"ContainerStarted","Data":"bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5"} Mar 21 05:11:22 crc kubenswrapper[4839]: I0321 05:11:22.501965 4839 generic.go:334] "Generic (PLEG): container finished" podID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerID="bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5" exitCode=0 Mar 21 05:11:22 crc kubenswrapper[4839]: I0321 05:11:22.502061 4839 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-fh65n" event={"ID":"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e","Type":"ContainerDied","Data":"bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5"} Mar 21 05:11:22 crc kubenswrapper[4839]: I0321 05:11:22.502425 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh65n" event={"ID":"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e","Type":"ContainerStarted","Data":"2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a"} Mar 21 05:11:22 crc kubenswrapper[4839]: I0321 05:11:22.532141 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fh65n" podStartSLOduration=3.1604547849999998 podStartE2EDuration="4.532110564s" podCreationTimestamp="2026-03-21 05:11:18 +0000 UTC" firstStartedPulling="2026-03-21 05:11:20.484161495 +0000 UTC m=+2884.811948171" lastFinishedPulling="2026-03-21 05:11:21.855817274 +0000 UTC m=+2886.183603950" observedRunningTime="2026-03-21 05:11:22.521211996 +0000 UTC m=+2886.848998682" watchObservedRunningTime="2026-03-21 05:11:22.532110564 +0000 UTC m=+2886.859897240" Mar 21 05:11:28 crc kubenswrapper[4839]: I0321 05:11:28.767318 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:28 crc kubenswrapper[4839]: I0321 05:11:28.767848 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:28 crc kubenswrapper[4839]: I0321 05:11:28.817285 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:29 crc kubenswrapper[4839]: I0321 05:11:29.764453 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:29 crc kubenswrapper[4839]: I0321 
05:11:29.811458 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fh65n"] Mar 21 05:11:31 crc kubenswrapper[4839]: I0321 05:11:31.727738 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fh65n" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="registry-server" containerID="cri-o://2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a" gracePeriod=2 Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.183985 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.208338 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frdgb\" (UniqueName: \"kubernetes.io/projected/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-kube-api-access-frdgb\") pod \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.208758 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-utilities\") pod \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.208795 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-catalog-content\") pod \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\" (UID: \"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e\") " Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.211306 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-utilities" (OuterVolumeSpecName: 
"utilities") pod "b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" (UID: "b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.217114 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-kube-api-access-frdgb" (OuterVolumeSpecName: "kube-api-access-frdgb") pod "b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" (UID: "b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e"). InnerVolumeSpecName "kube-api-access-frdgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.310880 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frdgb\" (UniqueName: \"kubernetes.io/projected/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-kube-api-access-frdgb\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.310912 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.738450 4839 generic.go:334] "Generic (PLEG): container finished" podID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerID="2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a" exitCode=0 Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.738495 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh65n" event={"ID":"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e","Type":"ContainerDied","Data":"2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a"} Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.738521 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fh65n" 
event={"ID":"b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e","Type":"ContainerDied","Data":"cde0425763174457626a3683ed0e3ac17018ee383a5638dbacee306c171373be"} Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.738541 4839 scope.go:117] "RemoveContainer" containerID="2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.738702 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fh65n" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.762893 4839 scope.go:117] "RemoveContainer" containerID="bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.783905 4839 scope.go:117] "RemoveContainer" containerID="68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.829455 4839 scope.go:117] "RemoveContainer" containerID="2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a" Mar 21 05:11:32 crc kubenswrapper[4839]: E0321 05:11:32.830079 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a\": container with ID starting with 2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a not found: ID does not exist" containerID="2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.830129 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a"} err="failed to get container status \"2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a\": rpc error: code = NotFound desc = could not find container \"2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a\": 
container with ID starting with 2ca0749692ccec6e33ef8ee8342a8dfda5712c41830dde3cd8fb1bca6436d60a not found: ID does not exist" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.830156 4839 scope.go:117] "RemoveContainer" containerID="bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5" Mar 21 05:11:32 crc kubenswrapper[4839]: E0321 05:11:32.830540 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5\": container with ID starting with bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5 not found: ID does not exist" containerID="bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.830566 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5"} err="failed to get container status \"bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5\": rpc error: code = NotFound desc = could not find container \"bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5\": container with ID starting with bae945bd03b95ffa50789d0010748dc9fb1a42844d6aa29e54ac008374bc7eb5 not found: ID does not exist" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.830592 4839 scope.go:117] "RemoveContainer" containerID="68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883" Mar 21 05:11:32 crc kubenswrapper[4839]: E0321 05:11:32.830841 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883\": container with ID starting with 68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883 not found: ID does not exist" 
containerID="68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883" Mar 21 05:11:32 crc kubenswrapper[4839]: I0321 05:11:32.830884 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883"} err="failed to get container status \"68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883\": rpc error: code = NotFound desc = could not find container \"68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883\": container with ID starting with 68c35cefa145fbe9f7c4e26de6ffc1ef0cceda473451ac306382c83dcdb3c883 not found: ID does not exist" Mar 21 05:11:33 crc kubenswrapper[4839]: I0321 05:11:33.166756 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" (UID: "b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:11:33 crc kubenswrapper[4839]: I0321 05:11:33.226952 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:11:33 crc kubenswrapper[4839]: I0321 05:11:33.540295 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fh65n"] Mar 21 05:11:33 crc kubenswrapper[4839]: I0321 05:11:33.550561 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fh65n"] Mar 21 05:11:34 crc kubenswrapper[4839]: I0321 05:11:34.464851 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" path="/var/lib/kubelet/pods/b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e/volumes" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.153682 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567832-p2sbz"] Mar 21 05:12:00 crc kubenswrapper[4839]: E0321 05:12:00.154516 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="extract-content" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.154530 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="extract-content" Mar 21 05:12:00 crc kubenswrapper[4839]: E0321 05:12:00.154569 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="extract-utilities" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.154591 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="extract-utilities" Mar 21 05:12:00 crc kubenswrapper[4839]: E0321 05:12:00.154601 4839 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="registry-server" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.154607 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="registry-server" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.154769 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1dd30d5-cf1b-46d1-bea2-1e3298f2f53e" containerName="registry-server" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.155461 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-p2sbz" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.157341 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.157378 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.157737 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.161338 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-p2sbz"] Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.218439 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdsg7\" (UniqueName: \"kubernetes.io/projected/82138c0f-eab6-4265-8db8-a8a1d934493a-kube-api-access-hdsg7\") pod \"auto-csr-approver-29567832-p2sbz\" (UID: \"82138c0f-eab6-4265-8db8-a8a1d934493a\") " pod="openshift-infra/auto-csr-approver-29567832-p2sbz" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.319785 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdsg7\" 
(UniqueName: \"kubernetes.io/projected/82138c0f-eab6-4265-8db8-a8a1d934493a-kube-api-access-hdsg7\") pod \"auto-csr-approver-29567832-p2sbz\" (UID: \"82138c0f-eab6-4265-8db8-a8a1d934493a\") " pod="openshift-infra/auto-csr-approver-29567832-p2sbz" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.340287 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdsg7\" (UniqueName: \"kubernetes.io/projected/82138c0f-eab6-4265-8db8-a8a1d934493a-kube-api-access-hdsg7\") pod \"auto-csr-approver-29567832-p2sbz\" (UID: \"82138c0f-eab6-4265-8db8-a8a1d934493a\") " pod="openshift-infra/auto-csr-approver-29567832-p2sbz" Mar 21 05:12:00 crc kubenswrapper[4839]: I0321 05:12:00.473868 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-p2sbz" Mar 21 05:12:01 crc kubenswrapper[4839]: I0321 05:12:01.182346 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-p2sbz"] Mar 21 05:12:02 crc kubenswrapper[4839]: I0321 05:12:02.012638 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567832-p2sbz" event={"ID":"82138c0f-eab6-4265-8db8-a8a1d934493a","Type":"ContainerStarted","Data":"c3d42162bd8f80621e321b47a8de003804f416494da4aef352c458177b7ef499"} Mar 21 05:12:03 crc kubenswrapper[4839]: I0321 05:12:03.023955 4839 generic.go:334] "Generic (PLEG): container finished" podID="82138c0f-eab6-4265-8db8-a8a1d934493a" containerID="3318757ff7e10c85856bb47e2aa55c12ee4b6d281f642a8f4e0f9c93681e4785" exitCode=0 Mar 21 05:12:03 crc kubenswrapper[4839]: I0321 05:12:03.024056 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567832-p2sbz" event={"ID":"82138c0f-eab6-4265-8db8-a8a1d934493a","Type":"ContainerDied","Data":"3318757ff7e10c85856bb47e2aa55c12ee4b6d281f642a8f4e0f9c93681e4785"} Mar 21 05:12:04 crc kubenswrapper[4839]: I0321 05:12:04.431675 4839 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-p2sbz" Mar 21 05:12:04 crc kubenswrapper[4839]: I0321 05:12:04.536183 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdsg7\" (UniqueName: \"kubernetes.io/projected/82138c0f-eab6-4265-8db8-a8a1d934493a-kube-api-access-hdsg7\") pod \"82138c0f-eab6-4265-8db8-a8a1d934493a\" (UID: \"82138c0f-eab6-4265-8db8-a8a1d934493a\") " Mar 21 05:12:04 crc kubenswrapper[4839]: I0321 05:12:04.542311 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82138c0f-eab6-4265-8db8-a8a1d934493a-kube-api-access-hdsg7" (OuterVolumeSpecName: "kube-api-access-hdsg7") pod "82138c0f-eab6-4265-8db8-a8a1d934493a" (UID: "82138c0f-eab6-4265-8db8-a8a1d934493a"). InnerVolumeSpecName "kube-api-access-hdsg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:12:04 crc kubenswrapper[4839]: I0321 05:12:04.639200 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdsg7\" (UniqueName: \"kubernetes.io/projected/82138c0f-eab6-4265-8db8-a8a1d934493a-kube-api-access-hdsg7\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:05 crc kubenswrapper[4839]: I0321 05:12:05.042181 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567832-p2sbz" event={"ID":"82138c0f-eab6-4265-8db8-a8a1d934493a","Type":"ContainerDied","Data":"c3d42162bd8f80621e321b47a8de003804f416494da4aef352c458177b7ef499"} Mar 21 05:12:05 crc kubenswrapper[4839]: I0321 05:12:05.042522 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3d42162bd8f80621e321b47a8de003804f416494da4aef352c458177b7ef499" Mar 21 05:12:05 crc kubenswrapper[4839]: I0321 05:12:05.042236 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567832-p2sbz" Mar 21 05:12:05 crc kubenswrapper[4839]: I0321 05:12:05.500881 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-x4cdd"] Mar 21 05:12:05 crc kubenswrapper[4839]: I0321 05:12:05.510130 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567826-x4cdd"] Mar 21 05:12:06 crc kubenswrapper[4839]: I0321 05:12:06.465289 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba8ac9dd-e3e7-4e21-9286-731d926d9580" path="/var/lib/kubelet/pods/ba8ac9dd-e3e7-4e21-9286-731d926d9580/volumes" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.309765 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ffqlt"] Mar 21 05:12:16 crc kubenswrapper[4839]: E0321 05:12:16.310854 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82138c0f-eab6-4265-8db8-a8a1d934493a" containerName="oc" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.310875 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="82138c0f-eab6-4265-8db8-a8a1d934493a" containerName="oc" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.311094 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="82138c0f-eab6-4265-8db8-a8a1d934493a" containerName="oc" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.313383 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.323376 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffqlt"] Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.499387 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqcr9\" (UniqueName: \"kubernetes.io/projected/2caa3218-47ca-4a13-aa31-bc551be3b478-kube-api-access-nqcr9\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.500413 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-catalog-content\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.500506 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-utilities\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.603643 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-utilities\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.603790 4839 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-nqcr9\" (UniqueName: \"kubernetes.io/projected/2caa3218-47ca-4a13-aa31-bc551be3b478-kube-api-access-nqcr9\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.603960 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-catalog-content\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.605714 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-utilities\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.607043 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-catalog-content\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.639434 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqcr9\" (UniqueName: \"kubernetes.io/projected/2caa3218-47ca-4a13-aa31-bc551be3b478-kube-api-access-nqcr9\") pod \"redhat-marketplace-ffqlt\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:16 crc kubenswrapper[4839]: I0321 05:12:16.937778 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:19 crc kubenswrapper[4839]: I0321 05:12:19.422861 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffqlt"] Mar 21 05:12:20 crc kubenswrapper[4839]: I0321 05:12:20.110202 4839 generic.go:334] "Generic (PLEG): container finished" podID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerID="70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d" exitCode=0 Mar 21 05:12:20 crc kubenswrapper[4839]: I0321 05:12:20.110293 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffqlt" event={"ID":"2caa3218-47ca-4a13-aa31-bc551be3b478","Type":"ContainerDied","Data":"70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d"} Mar 21 05:12:20 crc kubenswrapper[4839]: I0321 05:12:20.110490 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffqlt" event={"ID":"2caa3218-47ca-4a13-aa31-bc551be3b478","Type":"ContainerStarted","Data":"8e022772352c8be147feb12372aa8cb33c79e9b947e4b88f3cb5a88ac672c9cf"} Mar 21 05:12:21 crc kubenswrapper[4839]: I0321 05:12:21.120765 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffqlt" event={"ID":"2caa3218-47ca-4a13-aa31-bc551be3b478","Type":"ContainerStarted","Data":"4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46"} Mar 21 05:12:22 crc kubenswrapper[4839]: I0321 05:12:22.136854 4839 generic.go:334] "Generic (PLEG): container finished" podID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerID="4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46" exitCode=0 Mar 21 05:12:22 crc kubenswrapper[4839]: I0321 05:12:22.137169 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffqlt" 
event={"ID":"2caa3218-47ca-4a13-aa31-bc551be3b478","Type":"ContainerDied","Data":"4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46"} Mar 21 05:12:23 crc kubenswrapper[4839]: I0321 05:12:23.150851 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffqlt" event={"ID":"2caa3218-47ca-4a13-aa31-bc551be3b478","Type":"ContainerStarted","Data":"52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264"} Mar 21 05:12:23 crc kubenswrapper[4839]: I0321 05:12:23.176919 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ffqlt" podStartSLOduration=4.495910051 podStartE2EDuration="7.176897296s" podCreationTimestamp="2026-03-21 05:12:16 +0000 UTC" firstStartedPulling="2026-03-21 05:12:20.11194645 +0000 UTC m=+2944.439733126" lastFinishedPulling="2026-03-21 05:12:22.792933695 +0000 UTC m=+2947.120720371" observedRunningTime="2026-03-21 05:12:23.171188245 +0000 UTC m=+2947.498974931" watchObservedRunningTime="2026-03-21 05:12:23.176897296 +0000 UTC m=+2947.504683972" Mar 21 05:12:26 crc kubenswrapper[4839]: I0321 05:12:26.938551 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:26 crc kubenswrapper[4839]: I0321 05:12:26.940787 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:26 crc kubenswrapper[4839]: I0321 05:12:26.994283 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:27 crc kubenswrapper[4839]: I0321 05:12:27.246973 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:27 crc kubenswrapper[4839]: I0321 05:12:27.305798 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-ffqlt"] Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.212906 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ffqlt" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="registry-server" containerID="cri-o://52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264" gracePeriod=2 Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.812805 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.913238 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-utilities\") pod \"2caa3218-47ca-4a13-aa31-bc551be3b478\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.914087 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqcr9\" (UniqueName: \"kubernetes.io/projected/2caa3218-47ca-4a13-aa31-bc551be3b478-kube-api-access-nqcr9\") pod \"2caa3218-47ca-4a13-aa31-bc551be3b478\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.914184 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-catalog-content\") pod \"2caa3218-47ca-4a13-aa31-bc551be3b478\" (UID: \"2caa3218-47ca-4a13-aa31-bc551be3b478\") " Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.914341 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-utilities" (OuterVolumeSpecName: "utilities") pod "2caa3218-47ca-4a13-aa31-bc551be3b478" (UID: 
"2caa3218-47ca-4a13-aa31-bc551be3b478"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.915060 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.919699 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2caa3218-47ca-4a13-aa31-bc551be3b478-kube-api-access-nqcr9" (OuterVolumeSpecName: "kube-api-access-nqcr9") pod "2caa3218-47ca-4a13-aa31-bc551be3b478" (UID: "2caa3218-47ca-4a13-aa31-bc551be3b478"). InnerVolumeSpecName "kube-api-access-nqcr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:12:29 crc kubenswrapper[4839]: I0321 05:12:29.943272 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2caa3218-47ca-4a13-aa31-bc551be3b478" (UID: "2caa3218-47ca-4a13-aa31-bc551be3b478"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.016498 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqcr9\" (UniqueName: \"kubernetes.io/projected/2caa3218-47ca-4a13-aa31-bc551be3b478-kube-api-access-nqcr9\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.016536 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2caa3218-47ca-4a13-aa31-bc551be3b478-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.226685 4839 generic.go:334] "Generic (PLEG): container finished" podID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerID="52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264" exitCode=0 Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.226738 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffqlt" event={"ID":"2caa3218-47ca-4a13-aa31-bc551be3b478","Type":"ContainerDied","Data":"52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264"} Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.226749 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ffqlt" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.226772 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ffqlt" event={"ID":"2caa3218-47ca-4a13-aa31-bc551be3b478","Type":"ContainerDied","Data":"8e022772352c8be147feb12372aa8cb33c79e9b947e4b88f3cb5a88ac672c9cf"} Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.226796 4839 scope.go:117] "RemoveContainer" containerID="52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.254656 4839 scope.go:117] "RemoveContainer" containerID="4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.264368 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffqlt"] Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.279430 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ffqlt"] Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.288101 4839 scope.go:117] "RemoveContainer" containerID="70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.322283 4839 scope.go:117] "RemoveContainer" containerID="52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264" Mar 21 05:12:30 crc kubenswrapper[4839]: E0321 05:12:30.322839 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264\": container with ID starting with 52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264 not found: ID does not exist" containerID="52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.322884 4839 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264"} err="failed to get container status \"52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264\": rpc error: code = NotFound desc = could not find container \"52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264\": container with ID starting with 52ce7e99b9b95f19a033cedb3c3ca847e7115e58b04648595d7e014abffa9264 not found: ID does not exist" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.322916 4839 scope.go:117] "RemoveContainer" containerID="4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46" Mar 21 05:12:30 crc kubenswrapper[4839]: E0321 05:12:30.326896 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46\": container with ID starting with 4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46 not found: ID does not exist" containerID="4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.326937 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46"} err="failed to get container status \"4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46\": rpc error: code = NotFound desc = could not find container \"4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46\": container with ID starting with 4b5fb6b6cc2d5f06a80ee49c54e037d42b230620b1d7edfaf461b6aecf85af46 not found: ID does not exist" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.326964 4839 scope.go:117] "RemoveContainer" containerID="70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d" Mar 21 05:12:30 crc kubenswrapper[4839]: E0321 
05:12:30.327303 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d\": container with ID starting with 70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d not found: ID does not exist" containerID="70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.327325 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d"} err="failed to get container status \"70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d\": rpc error: code = NotFound desc = could not find container \"70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d\": container with ID starting with 70a00c3a25b1779ac37ba330d9568d5a63f1d05aa1ec5bf3c6df68ccff242a5d not found: ID does not exist" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.466486 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" path="/var/lib/kubelet/pods/2caa3218-47ca-4a13-aa31-bc551be3b478/volumes" Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.980126 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:12:30 crc kubenswrapper[4839]: I0321 05:12:30.980388 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.396668 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9cfwq"] Mar 21 05:12:41 crc kubenswrapper[4839]: E0321 05:12:41.397904 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="extract-content" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.397920 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="extract-content" Mar 21 05:12:41 crc kubenswrapper[4839]: E0321 05:12:41.397943 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="registry-server" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.397948 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="registry-server" Mar 21 05:12:41 crc kubenswrapper[4839]: E0321 05:12:41.397977 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="extract-utilities" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.397983 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="extract-utilities" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.398147 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="2caa3218-47ca-4a13-aa31-bc551be3b478" containerName="registry-server" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.399465 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.417438 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9cfwq"] Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.670309 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-utilities\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.670609 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-catalog-content\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.670688 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhqvg\" (UniqueName: \"kubernetes.io/projected/759f8f2c-b554-4a39-80d1-ec067ebec86f-kube-api-access-dhqvg\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.772557 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhqvg\" (UniqueName: \"kubernetes.io/projected/759f8f2c-b554-4a39-80d1-ec067ebec86f-kube-api-access-dhqvg\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.772653 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-utilities\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.772828 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-catalog-content\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.773369 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-utilities\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.773410 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-catalog-content\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.795474 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhqvg\" (UniqueName: \"kubernetes.io/projected/759f8f2c-b554-4a39-80d1-ec067ebec86f-kube-api-access-dhqvg\") pod \"certified-operators-9cfwq\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:41 crc kubenswrapper[4839]: I0321 05:12:41.883104 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:42 crc kubenswrapper[4839]: I0321 05:12:42.441397 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9cfwq"] Mar 21 05:12:43 crc kubenswrapper[4839]: I0321 05:12:43.351975 4839 generic.go:334] "Generic (PLEG): container finished" podID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerID="0471e9eacbbda33ca133e7b38f3650ac296123484074595f33648044768e5d3e" exitCode=0 Mar 21 05:12:43 crc kubenswrapper[4839]: I0321 05:12:43.352071 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cfwq" event={"ID":"759f8f2c-b554-4a39-80d1-ec067ebec86f","Type":"ContainerDied","Data":"0471e9eacbbda33ca133e7b38f3650ac296123484074595f33648044768e5d3e"} Mar 21 05:12:43 crc kubenswrapper[4839]: I0321 05:12:43.352547 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cfwq" event={"ID":"759f8f2c-b554-4a39-80d1-ec067ebec86f","Type":"ContainerStarted","Data":"ff9c7d348116e9c8ef37ba10f189bce01fe1fe3bb594e044bacc7cb9f6670753"} Mar 21 05:12:44 crc kubenswrapper[4839]: I0321 05:12:44.363812 4839 generic.go:334] "Generic (PLEG): container finished" podID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerID="dc3562202bfc3445fc78d5123c12166dd88a29b9e49b2c01bd89f5c2b4c0b9fb" exitCode=0 Mar 21 05:12:44 crc kubenswrapper[4839]: I0321 05:12:44.363895 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cfwq" event={"ID":"759f8f2c-b554-4a39-80d1-ec067ebec86f","Type":"ContainerDied","Data":"dc3562202bfc3445fc78d5123c12166dd88a29b9e49b2c01bd89f5c2b4c0b9fb"} Mar 21 05:12:46 crc kubenswrapper[4839]: I0321 05:12:46.703623 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cfwq" 
event={"ID":"759f8f2c-b554-4a39-80d1-ec067ebec86f","Type":"ContainerStarted","Data":"3e0ce864b7433a6da2941869e9cf4394273201d0b5d8409bc391e6e0b7347210"} Mar 21 05:12:46 crc kubenswrapper[4839]: I0321 05:12:46.730441 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9cfwq" podStartSLOduration=3.3541314509999998 podStartE2EDuration="5.730419505s" podCreationTimestamp="2026-03-21 05:12:41 +0000 UTC" firstStartedPulling="2026-03-21 05:12:43.353695405 +0000 UTC m=+2967.681482081" lastFinishedPulling="2026-03-21 05:12:45.729983459 +0000 UTC m=+2970.057770135" observedRunningTime="2026-03-21 05:12:46.726632578 +0000 UTC m=+2971.054419264" watchObservedRunningTime="2026-03-21 05:12:46.730419505 +0000 UTC m=+2971.058206181" Mar 21 05:12:51 crc kubenswrapper[4839]: I0321 05:12:51.883950 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:51 crc kubenswrapper[4839]: I0321 05:12:51.884657 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:51 crc kubenswrapper[4839]: I0321 05:12:51.928120 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:52 crc kubenswrapper[4839]: I0321 05:12:52.801221 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:52 crc kubenswrapper[4839]: I0321 05:12:52.872462 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9cfwq"] Mar 21 05:12:54 crc kubenswrapper[4839]: I0321 05:12:54.763725 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9cfwq" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="registry-server" 
containerID="cri-o://3e0ce864b7433a6da2941869e9cf4394273201d0b5d8409bc391e6e0b7347210" gracePeriod=2 Mar 21 05:12:55 crc kubenswrapper[4839]: I0321 05:12:55.776202 4839 generic.go:334] "Generic (PLEG): container finished" podID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerID="3e0ce864b7433a6da2941869e9cf4394273201d0b5d8409bc391e6e0b7347210" exitCode=0 Mar 21 05:12:55 crc kubenswrapper[4839]: I0321 05:12:55.776215 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cfwq" event={"ID":"759f8f2c-b554-4a39-80d1-ec067ebec86f","Type":"ContainerDied","Data":"3e0ce864b7433a6da2941869e9cf4394273201d0b5d8409bc391e6e0b7347210"} Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.098373 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.267223 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-catalog-content\") pod \"759f8f2c-b554-4a39-80d1-ec067ebec86f\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.267380 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-utilities\") pod \"759f8f2c-b554-4a39-80d1-ec067ebec86f\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.267838 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhqvg\" (UniqueName: \"kubernetes.io/projected/759f8f2c-b554-4a39-80d1-ec067ebec86f-kube-api-access-dhqvg\") pod \"759f8f2c-b554-4a39-80d1-ec067ebec86f\" (UID: \"759f8f2c-b554-4a39-80d1-ec067ebec86f\") " Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 
05:12:56.268121 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-utilities" (OuterVolumeSpecName: "utilities") pod "759f8f2c-b554-4a39-80d1-ec067ebec86f" (UID: "759f8f2c-b554-4a39-80d1-ec067ebec86f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.268442 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.274169 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/759f8f2c-b554-4a39-80d1-ec067ebec86f-kube-api-access-dhqvg" (OuterVolumeSpecName: "kube-api-access-dhqvg") pod "759f8f2c-b554-4a39-80d1-ec067ebec86f" (UID: "759f8f2c-b554-4a39-80d1-ec067ebec86f"). InnerVolumeSpecName "kube-api-access-dhqvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.326220 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "759f8f2c-b554-4a39-80d1-ec067ebec86f" (UID: "759f8f2c-b554-4a39-80d1-ec067ebec86f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.387541 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhqvg\" (UniqueName: \"kubernetes.io/projected/759f8f2c-b554-4a39-80d1-ec067ebec86f-kube-api-access-dhqvg\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.387606 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/759f8f2c-b554-4a39-80d1-ec067ebec86f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.487694 4839 scope.go:117] "RemoveContainer" containerID="1be67ef407b003d168ed5f91777a4df15466b61c19dea5b77ca6763eff6dadb2" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.789003 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9cfwq" event={"ID":"759f8f2c-b554-4a39-80d1-ec067ebec86f","Type":"ContainerDied","Data":"ff9c7d348116e9c8ef37ba10f189bce01fe1fe3bb594e044bacc7cb9f6670753"} Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.790469 4839 scope.go:117] "RemoveContainer" containerID="3e0ce864b7433a6da2941869e9cf4394273201d0b5d8409bc391e6e0b7347210" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.789389 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9cfwq" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.815876 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9cfwq"] Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.819839 4839 scope.go:117] "RemoveContainer" containerID="dc3562202bfc3445fc78d5123c12166dd88a29b9e49b2c01bd89f5c2b4c0b9fb" Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.825388 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9cfwq"] Mar 21 05:12:56 crc kubenswrapper[4839]: I0321 05:12:56.840617 4839 scope.go:117] "RemoveContainer" containerID="0471e9eacbbda33ca133e7b38f3650ac296123484074595f33648044768e5d3e" Mar 21 05:12:58 crc kubenswrapper[4839]: I0321 05:12:58.462740 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" path="/var/lib/kubelet/pods/759f8f2c-b554-4a39-80d1-ec067ebec86f/volumes" Mar 21 05:13:00 crc kubenswrapper[4839]: I0321 05:13:00.980554 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:13:00 crc kubenswrapper[4839]: I0321 05:13:00.981297 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:13:30 crc kubenswrapper[4839]: I0321 05:13:30.979809 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:13:30 crc kubenswrapper[4839]: I0321 05:13:30.980489 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:13:30 crc kubenswrapper[4839]: I0321 05:13:30.980559 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:13:30 crc kubenswrapper[4839]: I0321 05:13:30.981542 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b3b7e174e088400e045d778d09a66442ae3aba7106eba8585b97c2b9d3b1ab21"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:13:30 crc kubenswrapper[4839]: I0321 05:13:30.981657 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://b3b7e174e088400e045d778d09a66442ae3aba7106eba8585b97c2b9d3b1ab21" gracePeriod=600 Mar 21 05:13:31 crc kubenswrapper[4839]: I0321 05:13:31.137229 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="b3b7e174e088400e045d778d09a66442ae3aba7106eba8585b97c2b9d3b1ab21" exitCode=0 Mar 21 05:13:31 crc kubenswrapper[4839]: I0321 05:13:31.137272 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"b3b7e174e088400e045d778d09a66442ae3aba7106eba8585b97c2b9d3b1ab21"} Mar 21 05:13:31 crc kubenswrapper[4839]: I0321 05:13:31.137303 4839 scope.go:117] "RemoveContainer" containerID="baf34d4219b300148cd1cf3e2853a6f9d40af9c7af97b863f8a2a90b0a187c21" Mar 21 05:13:32 crc kubenswrapper[4839]: I0321 05:13:32.155735 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"} Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.143687 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567834-zmfrq"] Mar 21 05:14:00 crc kubenswrapper[4839]: E0321 05:14:00.144587 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="extract-utilities" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.144606 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="extract-utilities" Mar 21 05:14:00 crc kubenswrapper[4839]: E0321 05:14:00.144625 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="registry-server" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.144632 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="registry-server" Mar 21 05:14:00 crc kubenswrapper[4839]: E0321 05:14:00.144653 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="extract-content" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.144660 4839 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="extract-content" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.144885 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="759f8f2c-b554-4a39-80d1-ec067ebec86f" containerName="registry-server" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.145471 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.148074 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.148299 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.149059 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.156306 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-zmfrq"] Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.292673 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4tqb\" (UniqueName: \"kubernetes.io/projected/acfde7eb-12d0-4baf-9958-3ff93b290071-kube-api-access-g4tqb\") pod \"auto-csr-approver-29567834-zmfrq\" (UID: \"acfde7eb-12d0-4baf-9958-3ff93b290071\") " pod="openshift-infra/auto-csr-approver-29567834-zmfrq" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.395675 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4tqb\" (UniqueName: \"kubernetes.io/projected/acfde7eb-12d0-4baf-9958-3ff93b290071-kube-api-access-g4tqb\") pod \"auto-csr-approver-29567834-zmfrq\" (UID: 
\"acfde7eb-12d0-4baf-9958-3ff93b290071\") " pod="openshift-infra/auto-csr-approver-29567834-zmfrq" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.432261 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4tqb\" (UniqueName: \"kubernetes.io/projected/acfde7eb-12d0-4baf-9958-3ff93b290071-kube-api-access-g4tqb\") pod \"auto-csr-approver-29567834-zmfrq\" (UID: \"acfde7eb-12d0-4baf-9958-3ff93b290071\") " pod="openshift-infra/auto-csr-approver-29567834-zmfrq" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.465675 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" Mar 21 05:14:00 crc kubenswrapper[4839]: I0321 05:14:00.923945 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-zmfrq"] Mar 21 05:14:00 crc kubenswrapper[4839]: W0321 05:14:00.925875 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacfde7eb_12d0_4baf_9958_3ff93b290071.slice/crio-f1abffb51a446bee24be83a02417f55dd6d29514c9693e93e4b0052d9cfa8ef4 WatchSource:0}: Error finding container f1abffb51a446bee24be83a02417f55dd6d29514c9693e93e4b0052d9cfa8ef4: Status 404 returned error can't find the container with id f1abffb51a446bee24be83a02417f55dd6d29514c9693e93e4b0052d9cfa8ef4 Mar 21 05:14:01 crc kubenswrapper[4839]: I0321 05:14:01.428578 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" event={"ID":"acfde7eb-12d0-4baf-9958-3ff93b290071","Type":"ContainerStarted","Data":"f1abffb51a446bee24be83a02417f55dd6d29514c9693e93e4b0052d9cfa8ef4"} Mar 21 05:14:02 crc kubenswrapper[4839]: I0321 05:14:02.441008 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" 
event={"ID":"acfde7eb-12d0-4baf-9958-3ff93b290071","Type":"ContainerStarted","Data":"93fa052ae298171aa1d976a2185cd4fffe6a03fbc4b347a5bb68165274eaac3a"} Mar 21 05:14:02 crc kubenswrapper[4839]: I0321 05:14:02.457162 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" podStartSLOduration=1.525210737 podStartE2EDuration="2.457145029s" podCreationTimestamp="2026-03-21 05:14:00 +0000 UTC" firstStartedPulling="2026-03-21 05:14:00.92818075 +0000 UTC m=+3045.255967426" lastFinishedPulling="2026-03-21 05:14:01.860115032 +0000 UTC m=+3046.187901718" observedRunningTime="2026-03-21 05:14:02.453514216 +0000 UTC m=+3046.781300892" watchObservedRunningTime="2026-03-21 05:14:02.457145029 +0000 UTC m=+3046.784931705" Mar 21 05:14:03 crc kubenswrapper[4839]: I0321 05:14:03.450665 4839 generic.go:334] "Generic (PLEG): container finished" podID="acfde7eb-12d0-4baf-9958-3ff93b290071" containerID="93fa052ae298171aa1d976a2185cd4fffe6a03fbc4b347a5bb68165274eaac3a" exitCode=0 Mar 21 05:14:03 crc kubenswrapper[4839]: I0321 05:14:03.450713 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" event={"ID":"acfde7eb-12d0-4baf-9958-3ff93b290071","Type":"ContainerDied","Data":"93fa052ae298171aa1d976a2185cd4fffe6a03fbc4b347a5bb68165274eaac3a"} Mar 21 05:14:04 crc kubenswrapper[4839]: I0321 05:14:04.831923 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" Mar 21 05:14:04 crc kubenswrapper[4839]: I0321 05:14:04.873512 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4tqb\" (UniqueName: \"kubernetes.io/projected/acfde7eb-12d0-4baf-9958-3ff93b290071-kube-api-access-g4tqb\") pod \"acfde7eb-12d0-4baf-9958-3ff93b290071\" (UID: \"acfde7eb-12d0-4baf-9958-3ff93b290071\") " Mar 21 05:14:04 crc kubenswrapper[4839]: I0321 05:14:04.897387 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acfde7eb-12d0-4baf-9958-3ff93b290071-kube-api-access-g4tqb" (OuterVolumeSpecName: "kube-api-access-g4tqb") pod "acfde7eb-12d0-4baf-9958-3ff93b290071" (UID: "acfde7eb-12d0-4baf-9958-3ff93b290071"). InnerVolumeSpecName "kube-api-access-g4tqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:14:04 crc kubenswrapper[4839]: I0321 05:14:04.975373 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4tqb\" (UniqueName: \"kubernetes.io/projected/acfde7eb-12d0-4baf-9958-3ff93b290071-kube-api-access-g4tqb\") on node \"crc\" DevicePath \"\"" Mar 21 05:14:05 crc kubenswrapper[4839]: I0321 05:14:05.469448 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" event={"ID":"acfde7eb-12d0-4baf-9958-3ff93b290071","Type":"ContainerDied","Data":"f1abffb51a446bee24be83a02417f55dd6d29514c9693e93e4b0052d9cfa8ef4"} Mar 21 05:14:05 crc kubenswrapper[4839]: I0321 05:14:05.469808 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1abffb51a446bee24be83a02417f55dd6d29514c9693e93e4b0052d9cfa8ef4" Mar 21 05:14:05 crc kubenswrapper[4839]: I0321 05:14:05.469805 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567834-zmfrq" Mar 21 05:14:05 crc kubenswrapper[4839]: I0321 05:14:05.520650 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-qnptz"] Mar 21 05:14:05 crc kubenswrapper[4839]: I0321 05:14:05.532957 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567828-qnptz"] Mar 21 05:14:06 crc kubenswrapper[4839]: I0321 05:14:06.463747 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e550e6e6-fc33-4703-b8db-6cd8169ebc7f" path="/var/lib/kubelet/pods/e550e6e6-fc33-4703-b8db-6cd8169ebc7f/volumes" Mar 21 05:14:56 crc kubenswrapper[4839]: I0321 05:14:56.620553 4839 scope.go:117] "RemoveContainer" containerID="4eee1ce2fe1133d7d9ea3d85cfff635c2acc34be75ad8890d23c93b28ff12298" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.142680 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd"] Mar 21 05:15:00 crc kubenswrapper[4839]: E0321 05:15:00.143603 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acfde7eb-12d0-4baf-9958-3ff93b290071" containerName="oc" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.143618 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="acfde7eb-12d0-4baf-9958-3ff93b290071" containerName="oc" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.143825 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="acfde7eb-12d0-4baf-9958-3ff93b290071" containerName="oc" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.144436 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.150454 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.150503 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.159370 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd"] Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.253963 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e54425e5-2050-4510-be6a-ef16c4311765-config-volume\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.254077 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt46f\" (UniqueName: \"kubernetes.io/projected/e54425e5-2050-4510-be6a-ef16c4311765-kube-api-access-xt46f\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.254150 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e54425e5-2050-4510-be6a-ef16c4311765-secret-volume\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.356150 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt46f\" (UniqueName: \"kubernetes.io/projected/e54425e5-2050-4510-be6a-ef16c4311765-kube-api-access-xt46f\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.356249 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e54425e5-2050-4510-be6a-ef16c4311765-secret-volume\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.356353 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e54425e5-2050-4510-be6a-ef16c4311765-config-volume\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.357438 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e54425e5-2050-4510-be6a-ef16c4311765-config-volume\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.362222 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e54425e5-2050-4510-be6a-ef16c4311765-secret-volume\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.375006 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt46f\" (UniqueName: \"kubernetes.io/projected/e54425e5-2050-4510-be6a-ef16c4311765-kube-api-access-xt46f\") pod \"collect-profiles-29567835-8vmjd\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.469041 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.915052 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd"] Mar 21 05:15:00 crc kubenswrapper[4839]: W0321 05:15:00.925216 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode54425e5_2050_4510_be6a_ef16c4311765.slice/crio-9f5822211a0a0dc8a462f22ebe73630d12cd9c388d680042f0d3c5791a530c45 WatchSource:0}: Error finding container 9f5822211a0a0dc8a462f22ebe73630d12cd9c388d680042f0d3c5791a530c45: Status 404 returned error can't find the container with id 9f5822211a0a0dc8a462f22ebe73630d12cd9c388d680042f0d3c5791a530c45 Mar 21 05:15:00 crc kubenswrapper[4839]: I0321 05:15:00.949580 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" event={"ID":"e54425e5-2050-4510-be6a-ef16c4311765","Type":"ContainerStarted","Data":"9f5822211a0a0dc8a462f22ebe73630d12cd9c388d680042f0d3c5791a530c45"} Mar 21 05:15:01 crc 
kubenswrapper[4839]: I0321 05:15:01.958196 4839 generic.go:334] "Generic (PLEG): container finished" podID="e54425e5-2050-4510-be6a-ef16c4311765" containerID="2048e73b844bad4a2409cb352a14732d0691192b6583598f901a01e2393f2a26" exitCode=0 Mar 21 05:15:01 crc kubenswrapper[4839]: I0321 05:15:01.958256 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" event={"ID":"e54425e5-2050-4510-be6a-ef16c4311765","Type":"ContainerDied","Data":"2048e73b844bad4a2409cb352a14732d0691192b6583598f901a01e2393f2a26"} Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.339259 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.415818 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e54425e5-2050-4510-be6a-ef16c4311765-secret-volume\") pod \"e54425e5-2050-4510-be6a-ef16c4311765\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.416190 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e54425e5-2050-4510-be6a-ef16c4311765-config-volume\") pod \"e54425e5-2050-4510-be6a-ef16c4311765\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.416438 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt46f\" (UniqueName: \"kubernetes.io/projected/e54425e5-2050-4510-be6a-ef16c4311765-kube-api-access-xt46f\") pod \"e54425e5-2050-4510-be6a-ef16c4311765\" (UID: \"e54425e5-2050-4510-be6a-ef16c4311765\") " Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.418418 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/e54425e5-2050-4510-be6a-ef16c4311765-config-volume" (OuterVolumeSpecName: "config-volume") pod "e54425e5-2050-4510-be6a-ef16c4311765" (UID: "e54425e5-2050-4510-be6a-ef16c4311765"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.427258 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e54425e5-2050-4510-be6a-ef16c4311765-kube-api-access-xt46f" (OuterVolumeSpecName: "kube-api-access-xt46f") pod "e54425e5-2050-4510-be6a-ef16c4311765" (UID: "e54425e5-2050-4510-be6a-ef16c4311765"). InnerVolumeSpecName "kube-api-access-xt46f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.435767 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e54425e5-2050-4510-be6a-ef16c4311765-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e54425e5-2050-4510-be6a-ef16c4311765" (UID: "e54425e5-2050-4510-be6a-ef16c4311765"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.519031 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt46f\" (UniqueName: \"kubernetes.io/projected/e54425e5-2050-4510-be6a-ef16c4311765-kube-api-access-xt46f\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.519074 4839 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e54425e5-2050-4510-be6a-ef16c4311765-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.519083 4839 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e54425e5-2050-4510-be6a-ef16c4311765-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.975826 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" event={"ID":"e54425e5-2050-4510-be6a-ef16c4311765","Type":"ContainerDied","Data":"9f5822211a0a0dc8a462f22ebe73630d12cd9c388d680042f0d3c5791a530c45"} Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.975868 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f5822211a0a0dc8a462f22ebe73630d12cd9c388d680042f0d3c5791a530c45" Mar 21 05:15:03 crc kubenswrapper[4839]: I0321 05:15:03.975925 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567835-8vmjd" Mar 21 05:15:04 crc kubenswrapper[4839]: I0321 05:15:04.430478 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr"] Mar 21 05:15:04 crc kubenswrapper[4839]: I0321 05:15:04.443198 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567790-knjwr"] Mar 21 05:15:04 crc kubenswrapper[4839]: I0321 05:15:04.461969 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd65835-5c51-49a6-8e2f-9ac9569c2c64" path="/var/lib/kubelet/pods/0fd65835-5c51-49a6-8e2f-9ac9569c2c64/volumes" Mar 21 05:15:56 crc kubenswrapper[4839]: I0321 05:15:56.686817 4839 scope.go:117] "RemoveContainer" containerID="d1742f96e69ee0f8c2f73ccffb16323bc9bae63d20c55bd829c98946a612539f" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.158010 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567836-7nx7s"] Mar 21 05:16:00 crc kubenswrapper[4839]: E0321 05:16:00.159351 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54425e5-2050-4510-be6a-ef16c4311765" containerName="collect-profiles" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.159380 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54425e5-2050-4510-be6a-ef16c4311765" containerName="collect-profiles" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.159708 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="e54425e5-2050-4510-be6a-ef16c4311765" containerName="collect-profiles" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.160832 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-7nx7s" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.163319 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.163464 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.163535 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.180609 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-7nx7s"] Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.276033 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q42dv\" (UniqueName: \"kubernetes.io/projected/bcf873d9-ae04-40eb-b855-cca2a045773c-kube-api-access-q42dv\") pod \"auto-csr-approver-29567836-7nx7s\" (UID: \"bcf873d9-ae04-40eb-b855-cca2a045773c\") " pod="openshift-infra/auto-csr-approver-29567836-7nx7s" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.378326 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q42dv\" (UniqueName: \"kubernetes.io/projected/bcf873d9-ae04-40eb-b855-cca2a045773c-kube-api-access-q42dv\") pod \"auto-csr-approver-29567836-7nx7s\" (UID: \"bcf873d9-ae04-40eb-b855-cca2a045773c\") " pod="openshift-infra/auto-csr-approver-29567836-7nx7s" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.409821 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q42dv\" (UniqueName: \"kubernetes.io/projected/bcf873d9-ae04-40eb-b855-cca2a045773c-kube-api-access-q42dv\") pod \"auto-csr-approver-29567836-7nx7s\" (UID: \"bcf873d9-ae04-40eb-b855-cca2a045773c\") " 
pod="openshift-infra/auto-csr-approver-29567836-7nx7s" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.479738 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-7nx7s" Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.979902 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:16:00 crc kubenswrapper[4839]: I0321 05:16:00.980460 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:16:01 crc kubenswrapper[4839]: I0321 05:16:01.017210 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-7nx7s"] Mar 21 05:16:01 crc kubenswrapper[4839]: I0321 05:16:01.512172 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567836-7nx7s" event={"ID":"bcf873d9-ae04-40eb-b855-cca2a045773c","Type":"ContainerStarted","Data":"74ccf6b6cf340e9b811448396311ae52b1038c7de266d673f1fab4fccbc8dc87"} Mar 21 05:16:03 crc kubenswrapper[4839]: I0321 05:16:03.529849 4839 generic.go:334] "Generic (PLEG): container finished" podID="bcf873d9-ae04-40eb-b855-cca2a045773c" containerID="7a58fda260a0b556c155576b775648d85fe42364027e5d171170d3cfbd959f32" exitCode=0 Mar 21 05:16:03 crc kubenswrapper[4839]: I0321 05:16:03.529915 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567836-7nx7s" 
event={"ID":"bcf873d9-ae04-40eb-b855-cca2a045773c","Type":"ContainerDied","Data":"7a58fda260a0b556c155576b775648d85fe42364027e5d171170d3cfbd959f32"} Mar 21 05:16:04 crc kubenswrapper[4839]: I0321 05:16:04.936539 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-7nx7s" Mar 21 05:16:05 crc kubenswrapper[4839]: I0321 05:16:05.075201 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q42dv\" (UniqueName: \"kubernetes.io/projected/bcf873d9-ae04-40eb-b855-cca2a045773c-kube-api-access-q42dv\") pod \"bcf873d9-ae04-40eb-b855-cca2a045773c\" (UID: \"bcf873d9-ae04-40eb-b855-cca2a045773c\") " Mar 21 05:16:05 crc kubenswrapper[4839]: I0321 05:16:05.080808 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf873d9-ae04-40eb-b855-cca2a045773c-kube-api-access-q42dv" (OuterVolumeSpecName: "kube-api-access-q42dv") pod "bcf873d9-ae04-40eb-b855-cca2a045773c" (UID: "bcf873d9-ae04-40eb-b855-cca2a045773c"). InnerVolumeSpecName "kube-api-access-q42dv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:16:05 crc kubenswrapper[4839]: I0321 05:16:05.177912 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q42dv\" (UniqueName: \"kubernetes.io/projected/bcf873d9-ae04-40eb-b855-cca2a045773c-kube-api-access-q42dv\") on node \"crc\" DevicePath \"\"" Mar 21 05:16:05 crc kubenswrapper[4839]: I0321 05:16:05.549645 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567836-7nx7s" event={"ID":"bcf873d9-ae04-40eb-b855-cca2a045773c","Type":"ContainerDied","Data":"74ccf6b6cf340e9b811448396311ae52b1038c7de266d673f1fab4fccbc8dc87"} Mar 21 05:16:05 crc kubenswrapper[4839]: I0321 05:16:05.549985 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74ccf6b6cf340e9b811448396311ae52b1038c7de266d673f1fab4fccbc8dc87" Mar 21 05:16:05 crc kubenswrapper[4839]: I0321 05:16:05.549910 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567836-7nx7s" Mar 21 05:16:06 crc kubenswrapper[4839]: I0321 05:16:06.016088 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-8w2nt"] Mar 21 05:16:06 crc kubenswrapper[4839]: I0321 05:16:06.027074 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567830-8w2nt"] Mar 21 05:16:06 crc kubenswrapper[4839]: I0321 05:16:06.464464 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec2b9da-f24b-47bb-95cc-2903624e2eb1" path="/var/lib/kubelet/pods/aec2b9da-f24b-47bb-95cc-2903624e2eb1/volumes" Mar 21 05:16:30 crc kubenswrapper[4839]: I0321 05:16:30.980987 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 21 05:16:30 crc kubenswrapper[4839]: I0321 05:16:30.982070 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:16:56 crc kubenswrapper[4839]: I0321 05:16:56.747368 4839 scope.go:117] "RemoveContainer" containerID="3edbd65acb10c2e93822fc30bbb912980fc1145d300030a3f42acd4b77e841c2" Mar 21 05:17:00 crc kubenswrapper[4839]: I0321 05:17:00.979954 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:17:00 crc kubenswrapper[4839]: I0321 05:17:00.980330 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:17:00 crc kubenswrapper[4839]: I0321 05:17:00.980390 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:17:00 crc kubenswrapper[4839]: I0321 05:17:00.981357 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
21 05:17:00 crc kubenswrapper[4839]: I0321 05:17:00.981447 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" gracePeriod=600 Mar 21 05:17:01 crc kubenswrapper[4839]: E0321 05:17:01.146453 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:17:02 crc kubenswrapper[4839]: I0321 05:17:02.088407 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" exitCode=0 Mar 21 05:17:02 crc kubenswrapper[4839]: I0321 05:17:02.088506 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"} Mar 21 05:17:02 crc kubenswrapper[4839]: I0321 05:17:02.088804 4839 scope.go:117] "RemoveContainer" containerID="b3b7e174e088400e045d778d09a66442ae3aba7106eba8585b97c2b9d3b1ab21" Mar 21 05:17:02 crc kubenswrapper[4839]: I0321 05:17:02.089608 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:17:02 crc kubenswrapper[4839]: E0321 05:17:02.089890 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:17:14 crc kubenswrapper[4839]: I0321 05:17:14.452739 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:17:14 crc kubenswrapper[4839]: E0321 05:17:14.453590 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:17:25 crc kubenswrapper[4839]: I0321 05:17:25.452831 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:17:25 crc kubenswrapper[4839]: E0321 05:17:25.453611 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:17:36 crc kubenswrapper[4839]: I0321 05:17:36.463677 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:17:36 crc kubenswrapper[4839]: E0321 05:17:36.464801 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:17:47 crc kubenswrapper[4839]: I0321 05:17:47.453493 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:17:47 crc kubenswrapper[4839]: E0321 05:17:47.454354 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.185364 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567838-z4kh5"] Mar 21 05:18:00 crc kubenswrapper[4839]: E0321 05:18:00.186280 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf873d9-ae04-40eb-b855-cca2a045773c" containerName="oc" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.186294 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf873d9-ae04-40eb-b855-cca2a045773c" containerName="oc" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.186541 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf873d9-ae04-40eb-b855-cca2a045773c" containerName="oc" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.187201 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-z4kh5" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.189082 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.189293 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.189424 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.203361 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-z4kh5"] Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.280803 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4lx8\" (UniqueName: \"kubernetes.io/projected/a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a-kube-api-access-c4lx8\") pod \"auto-csr-approver-29567838-z4kh5\" (UID: \"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a\") " pod="openshift-infra/auto-csr-approver-29567838-z4kh5" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.382976 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4lx8\" (UniqueName: \"kubernetes.io/projected/a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a-kube-api-access-c4lx8\") pod \"auto-csr-approver-29567838-z4kh5\" (UID: \"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a\") " pod="openshift-infra/auto-csr-approver-29567838-z4kh5" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.402836 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4lx8\" (UniqueName: \"kubernetes.io/projected/a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a-kube-api-access-c4lx8\") pod \"auto-csr-approver-29567838-z4kh5\" (UID: \"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a\") " 
pod="openshift-infra/auto-csr-approver-29567838-z4kh5" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.518722 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-z4kh5" Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.959480 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-z4kh5"] Mar 21 05:18:00 crc kubenswrapper[4839]: I0321 05:18:00.975974 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:18:01 crc kubenswrapper[4839]: I0321 05:18:01.453496 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:18:01 crc kubenswrapper[4839]: E0321 05:18:01.454032 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:18:01 crc kubenswrapper[4839]: I0321 05:18:01.643704 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567838-z4kh5" event={"ID":"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a","Type":"ContainerStarted","Data":"96b842525df8f60fe6b9cf7189b28d7b23df5df7c8156175f571aa55d2866ae4"} Mar 21 05:18:03 crc kubenswrapper[4839]: I0321 05:18:03.661626 4839 generic.go:334] "Generic (PLEG): container finished" podID="a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a" containerID="aa966754301c031bb6355b4136e2fe214f5819ce3ea77c126ebfb20a4377b523" exitCode=0 Mar 21 05:18:03 crc kubenswrapper[4839]: I0321 05:18:03.661708 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567838-z4kh5" event={"ID":"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a","Type":"ContainerDied","Data":"aa966754301c031bb6355b4136e2fe214f5819ce3ea77c126ebfb20a4377b523"} Mar 21 05:18:05 crc kubenswrapper[4839]: I0321 05:18:05.107189 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-z4kh5" Mar 21 05:18:05 crc kubenswrapper[4839]: I0321 05:18:05.181429 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4lx8\" (UniqueName: \"kubernetes.io/projected/a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a-kube-api-access-c4lx8\") pod \"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a\" (UID: \"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a\") " Mar 21 05:18:05 crc kubenswrapper[4839]: I0321 05:18:05.186811 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a-kube-api-access-c4lx8" (OuterVolumeSpecName: "kube-api-access-c4lx8") pod "a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a" (UID: "a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a"). InnerVolumeSpecName "kube-api-access-c4lx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:18:05 crc kubenswrapper[4839]: I0321 05:18:05.283855 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4lx8\" (UniqueName: \"kubernetes.io/projected/a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a-kube-api-access-c4lx8\") on node \"crc\" DevicePath \"\"" Mar 21 05:18:05 crc kubenswrapper[4839]: I0321 05:18:05.679770 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567838-z4kh5" event={"ID":"a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a","Type":"ContainerDied","Data":"96b842525df8f60fe6b9cf7189b28d7b23df5df7c8156175f571aa55d2866ae4"} Mar 21 05:18:05 crc kubenswrapper[4839]: I0321 05:18:05.679810 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96b842525df8f60fe6b9cf7189b28d7b23df5df7c8156175f571aa55d2866ae4" Mar 21 05:18:05 crc kubenswrapper[4839]: I0321 05:18:05.679833 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567838-z4kh5" Mar 21 05:18:06 crc kubenswrapper[4839]: I0321 05:18:06.176096 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-p2sbz"] Mar 21 05:18:06 crc kubenswrapper[4839]: I0321 05:18:06.185336 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567832-p2sbz"] Mar 21 05:18:06 crc kubenswrapper[4839]: I0321 05:18:06.463409 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82138c0f-eab6-4265-8db8-a8a1d934493a" path="/var/lib/kubelet/pods/82138c0f-eab6-4265-8db8-a8a1d934493a/volumes" Mar 21 05:18:15 crc kubenswrapper[4839]: I0321 05:18:15.453135 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:18:15 crc kubenswrapper[4839]: E0321 05:18:15.453852 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:18:29 crc kubenswrapper[4839]: I0321 05:18:29.453950 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:18:29 crc kubenswrapper[4839]: E0321 05:18:29.455188 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:18:43 crc kubenswrapper[4839]: I0321 05:18:43.452515 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:18:43 crc kubenswrapper[4839]: E0321 05:18:43.453475 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:18:54 crc kubenswrapper[4839]: I0321 05:18:54.453784 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:18:54 crc kubenswrapper[4839]: E0321 05:18:54.455364 4839 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:18:56 crc kubenswrapper[4839]: I0321 05:18:56.838260 4839 scope.go:117] "RemoveContainer" containerID="3318757ff7e10c85856bb47e2aa55c12ee4b6d281f642a8f4e0f9c93681e4785" Mar 21 05:19:06 crc kubenswrapper[4839]: I0321 05:19:06.458383 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:19:06 crc kubenswrapper[4839]: E0321 05:19:06.459200 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:19:20 crc kubenswrapper[4839]: I0321 05:19:20.452867 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:19:20 crc kubenswrapper[4839]: E0321 05:19:20.453683 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:19:34 crc kubenswrapper[4839]: I0321 05:19:34.453528 4839 scope.go:117] 
"RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:19:34 crc kubenswrapper[4839]: E0321 05:19:34.454849 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.438103 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l4vch"] Mar 21 05:19:45 crc kubenswrapper[4839]: E0321 05:19:45.439122 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a" containerName="oc" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.439139 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a" containerName="oc" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.439335 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a" containerName="oc" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.440719 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.467460 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l4vch"] Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.580008 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-catalog-content\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.580249 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5vn5\" (UniqueName: \"kubernetes.io/projected/f723037e-e37c-4441-b840-6a3da3ec2fff-kube-api-access-h5vn5\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.580313 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-utilities\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.682030 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-catalog-content\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch" Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.682180 4839 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-h5vn5\" (UniqueName: \"kubernetes.io/projected/f723037e-e37c-4441-b840-6a3da3ec2fff-kube-api-access-h5vn5\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch"
Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.682216 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-utilities\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch"
Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.682546 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-catalog-content\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch"
Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.682610 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-utilities\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch"
Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.709276 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5vn5\" (UniqueName: \"kubernetes.io/projected/f723037e-e37c-4441-b840-6a3da3ec2fff-kube-api-access-h5vn5\") pod \"redhat-operators-l4vch\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") " pod="openshift-marketplace/redhat-operators-l4vch"
Mar 21 05:19:45 crc kubenswrapper[4839]: I0321 05:19:45.763092 4839 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-l4vch"
Mar 21 05:19:46 crc kubenswrapper[4839]: I0321 05:19:46.303430 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l4vch"]
Mar 21 05:19:46 crc kubenswrapper[4839]: I0321 05:19:46.751454 4839 generic.go:334] "Generic (PLEG): container finished" podID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerID="36e58fa15812a6897b50d6db0ed7a274a81303b9dec23efb000319a8d9a33254" exitCode=0
Mar 21 05:19:46 crc kubenswrapper[4839]: I0321 05:19:46.751510 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4vch" event={"ID":"f723037e-e37c-4441-b840-6a3da3ec2fff","Type":"ContainerDied","Data":"36e58fa15812a6897b50d6db0ed7a274a81303b9dec23efb000319a8d9a33254"}
Mar 21 05:19:46 crc kubenswrapper[4839]: I0321 05:19:46.752600 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4vch" event={"ID":"f723037e-e37c-4441-b840-6a3da3ec2fff","Type":"ContainerStarted","Data":"e203f4608cf6c11b77dd0d150de856a5e47df3e1fd1ef0fba1ab2bfc0eda5172"}
Mar 21 05:19:48 crc kubenswrapper[4839]: I0321 05:19:48.454105 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"
Mar 21 05:19:48 crc kubenswrapper[4839]: E0321 05:19:48.455659 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:19:48 crc kubenswrapper[4839]: I0321 05:19:48.772153 4839 generic.go:334] "Generic (PLEG): container finished" podID="f723037e-e37c-4441-b840-6a3da3ec2fff"
containerID="0fb7e2104f45f80289c410cb7ed43ea4ebad69cb7a12b53a2a2c205806ea1801" exitCode=0
Mar 21 05:19:48 crc kubenswrapper[4839]: I0321 05:19:48.772229 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4vch" event={"ID":"f723037e-e37c-4441-b840-6a3da3ec2fff","Type":"ContainerDied","Data":"0fb7e2104f45f80289c410cb7ed43ea4ebad69cb7a12b53a2a2c205806ea1801"}
Mar 21 05:19:50 crc kubenswrapper[4839]: I0321 05:19:50.823652 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4vch" event={"ID":"f723037e-e37c-4441-b840-6a3da3ec2fff","Type":"ContainerStarted","Data":"0f0fb05b1ad7b9c8c86908bf5eed059955dab4bf2710991db435f65e1f3837bc"}
Mar 21 05:19:50 crc kubenswrapper[4839]: I0321 05:19:50.895221 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l4vch" podStartSLOduration=3.034622317 podStartE2EDuration="5.895195173s" podCreationTimestamp="2026-03-21 05:19:45 +0000 UTC" firstStartedPulling="2026-03-21 05:19:46.753144875 +0000 UTC m=+3391.080931551" lastFinishedPulling="2026-03-21 05:19:49.613717731 +0000 UTC m=+3393.941504407" observedRunningTime="2026-03-21 05:19:50.841266061 +0000 UTC m=+3395.169052737" watchObservedRunningTime="2026-03-21 05:19:50.895195173 +0000 UTC m=+3395.222981849"
Mar 21 05:19:55 crc kubenswrapper[4839]: I0321 05:19:55.763604 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l4vch"
Mar 21 05:19:55 crc kubenswrapper[4839]: I0321 05:19:55.764447 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l4vch"
Mar 21 05:19:56 crc kubenswrapper[4839]: I0321 05:19:56.825424 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l4vch" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="registry-server"
probeResult="failure" output=<
Mar 21 05:19:56 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s
Mar 21 05:19:56 crc kubenswrapper[4839]: >
Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.145135 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567840-rbk96"]
Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.146918 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-rbk96"
Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.149976 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.150495 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.150770 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2"
Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.160860 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-rbk96"]
Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.272905 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whlvb\" (UniqueName: \"kubernetes.io/projected/b89d49dc-a7f5-4a24-98c5-818fe0e99ded-kube-api-access-whlvb\") pod \"auto-csr-approver-29567840-rbk96\" (UID: \"b89d49dc-a7f5-4a24-98c5-818fe0e99ded\") " pod="openshift-infra/auto-csr-approver-29567840-rbk96"
Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.375064 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whlvb\" (UniqueName: \"kubernetes.io/projected/b89d49dc-a7f5-4a24-98c5-818fe0e99ded-kube-api-access-whlvb\") pod \"auto-csr-approver-29567840-rbk96\" (UID:
\"b89d49dc-a7f5-4a24-98c5-818fe0e99ded\") " pod="openshift-infra/auto-csr-approver-29567840-rbk96"
Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.397893 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whlvb\" (UniqueName: \"kubernetes.io/projected/b89d49dc-a7f5-4a24-98c5-818fe0e99ded-kube-api-access-whlvb\") pod \"auto-csr-approver-29567840-rbk96\" (UID: \"b89d49dc-a7f5-4a24-98c5-818fe0e99ded\") " pod="openshift-infra/auto-csr-approver-29567840-rbk96"
Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.467023 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-rbk96"
Mar 21 05:20:00 crc kubenswrapper[4839]: I0321 05:20:00.935072 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-rbk96"]
Mar 21 05:20:01 crc kubenswrapper[4839]: I0321 05:20:01.925237 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567840-rbk96" event={"ID":"b89d49dc-a7f5-4a24-98c5-818fe0e99ded","Type":"ContainerStarted","Data":"9b2c05d698f1afb22dc74eff02006e38c51edb53e34ee88d85a71d07b8b98779"}
Mar 21 05:20:02 crc kubenswrapper[4839]: I0321 05:20:02.934852 4839 generic.go:334] "Generic (PLEG): container finished" podID="b89d49dc-a7f5-4a24-98c5-818fe0e99ded" containerID="229d14a49fd481dc353dc5d371b3e82a7f1a7396db2fffc8de8355fe9e2338cb" exitCode=0
Mar 21 05:20:02 crc kubenswrapper[4839]: I0321 05:20:02.935032 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567840-rbk96" event={"ID":"b89d49dc-a7f5-4a24-98c5-818fe0e99ded","Type":"ContainerDied","Data":"229d14a49fd481dc353dc5d371b3e82a7f1a7396db2fffc8de8355fe9e2338cb"}
Mar 21 05:20:03 crc kubenswrapper[4839]: I0321 05:20:03.452905 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"
Mar 21 05:20:03 crc kubenswrapper[4839]:
E0321 05:20:03.453226 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:20:04 crc kubenswrapper[4839]: I0321 05:20:04.328643 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-rbk96"
Mar 21 05:20:04 crc kubenswrapper[4839]: I0321 05:20:04.469701 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whlvb\" (UniqueName: \"kubernetes.io/projected/b89d49dc-a7f5-4a24-98c5-818fe0e99ded-kube-api-access-whlvb\") pod \"b89d49dc-a7f5-4a24-98c5-818fe0e99ded\" (UID: \"b89d49dc-a7f5-4a24-98c5-818fe0e99ded\") "
Mar 21 05:20:04 crc kubenswrapper[4839]: I0321 05:20:04.476235 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b89d49dc-a7f5-4a24-98c5-818fe0e99ded-kube-api-access-whlvb" (OuterVolumeSpecName: "kube-api-access-whlvb") pod "b89d49dc-a7f5-4a24-98c5-818fe0e99ded" (UID: "b89d49dc-a7f5-4a24-98c5-818fe0e99ded"). InnerVolumeSpecName "kube-api-access-whlvb".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:20:04 crc kubenswrapper[4839]: I0321 05:20:04.572759 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whlvb\" (UniqueName: \"kubernetes.io/projected/b89d49dc-a7f5-4a24-98c5-818fe0e99ded-kube-api-access-whlvb\") on node \"crc\" DevicePath \"\""
Mar 21 05:20:04 crc kubenswrapper[4839]: I0321 05:20:04.953840 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567840-rbk96" event={"ID":"b89d49dc-a7f5-4a24-98c5-818fe0e99ded","Type":"ContainerDied","Data":"9b2c05d698f1afb22dc74eff02006e38c51edb53e34ee88d85a71d07b8b98779"}
Mar 21 05:20:04 crc kubenswrapper[4839]: I0321 05:20:04.954027 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b2c05d698f1afb22dc74eff02006e38c51edb53e34ee88d85a71d07b8b98779"
Mar 21 05:20:04 crc kubenswrapper[4839]: I0321 05:20:04.953939 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567840-rbk96"
Mar 21 05:20:05 crc kubenswrapper[4839]: I0321 05:20:05.437428 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-zmfrq"]
Mar 21 05:20:05 crc kubenswrapper[4839]: I0321 05:20:05.448509 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567834-zmfrq"]
Mar 21 05:20:06 crc kubenswrapper[4839]: I0321 05:20:06.471097 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acfde7eb-12d0-4baf-9958-3ff93b290071" path="/var/lib/kubelet/pods/acfde7eb-12d0-4baf-9958-3ff93b290071/volumes"
Mar 21 05:20:06 crc kubenswrapper[4839]: I0321 05:20:06.817439 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l4vch" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="registry-server" probeResult="failure" output=<
Mar 21 05:20:06 crc kubenswrapper[4839]:
timeout: failed to connect service ":50051" within 1s
Mar 21 05:20:06 crc kubenswrapper[4839]: >
Mar 21 05:20:14 crc kubenswrapper[4839]: I0321 05:20:14.453636 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"
Mar 21 05:20:14 crc kubenswrapper[4839]: E0321 05:20:14.454318 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:20:16 crc kubenswrapper[4839]: I0321 05:20:16.904379 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l4vch" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="registry-server" probeResult="failure" output=<
Mar 21 05:20:16 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s
Mar 21 05:20:16 crc kubenswrapper[4839]: >
Mar 21 05:20:25 crc kubenswrapper[4839]: I0321 05:20:25.805039 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l4vch"
Mar 21 05:20:25 crc kubenswrapper[4839]: I0321 05:20:25.854448 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l4vch"
Mar 21 05:20:26 crc kubenswrapper[4839]: I0321 05:20:26.042310 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l4vch"]
Mar 21 05:20:27 crc kubenswrapper[4839]: I0321 05:20:27.164081 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l4vch" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="registry-server"
containerID="cri-o://0f0fb05b1ad7b9c8c86908bf5eed059955dab4bf2710991db435f65e1f3837bc" gracePeriod=2
Mar 21 05:20:27 crc kubenswrapper[4839]: I0321 05:20:27.453103 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"
Mar 21 05:20:27 crc kubenswrapper[4839]: E0321 05:20:27.453381 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.177965 4839 generic.go:334] "Generic (PLEG): container finished" podID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerID="0f0fb05b1ad7b9c8c86908bf5eed059955dab4bf2710991db435f65e1f3837bc" exitCode=0
Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.178460 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4vch" event={"ID":"f723037e-e37c-4441-b840-6a3da3ec2fff","Type":"ContainerDied","Data":"0f0fb05b1ad7b9c8c86908bf5eed059955dab4bf2710991db435f65e1f3837bc"}
Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.178501 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l4vch" event={"ID":"f723037e-e37c-4441-b840-6a3da3ec2fff","Type":"ContainerDied","Data":"e203f4608cf6c11b77dd0d150de856a5e47df3e1fd1ef0fba1ab2bfc0eda5172"}
Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.178513 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e203f4608cf6c11b77dd0d150de856a5e47df3e1fd1ef0fba1ab2bfc0eda5172"
Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.196910 4839 util.go:48] "No ready sandbox for pod can be
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l4vch"
Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.254785 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-utilities\") pod \"f723037e-e37c-4441-b840-6a3da3ec2fff\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") "
Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.255054 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5vn5\" (UniqueName: \"kubernetes.io/projected/f723037e-e37c-4441-b840-6a3da3ec2fff-kube-api-access-h5vn5\") pod \"f723037e-e37c-4441-b840-6a3da3ec2fff\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") "
Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.255188 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-catalog-content\") pod \"f723037e-e37c-4441-b840-6a3da3ec2fff\" (UID: \"f723037e-e37c-4441-b840-6a3da3ec2fff\") "
Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.255624 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-utilities" (OuterVolumeSpecName: "utilities") pod "f723037e-e37c-4441-b840-6a3da3ec2fff" (UID: "f723037e-e37c-4441-b840-6a3da3ec2fff"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.255822 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.262942 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f723037e-e37c-4441-b840-6a3da3ec2fff-kube-api-access-h5vn5" (OuterVolumeSpecName: "kube-api-access-h5vn5") pod "f723037e-e37c-4441-b840-6a3da3ec2fff" (UID: "f723037e-e37c-4441-b840-6a3da3ec2fff"). InnerVolumeSpecName "kube-api-access-h5vn5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.357905 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5vn5\" (UniqueName: \"kubernetes.io/projected/f723037e-e37c-4441-b840-6a3da3ec2fff-kube-api-access-h5vn5\") on node \"crc\" DevicePath \"\""
Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.398307 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f723037e-e37c-4441-b840-6a3da3ec2fff" (UID: "f723037e-e37c-4441-b840-6a3da3ec2fff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:20:28 crc kubenswrapper[4839]: I0321 05:20:28.459783 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f723037e-e37c-4441-b840-6a3da3ec2fff-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 05:20:29 crc kubenswrapper[4839]: I0321 05:20:29.188050 4839 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-l4vch"
Mar 21 05:20:29 crc kubenswrapper[4839]: I0321 05:20:29.224307 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l4vch"]
Mar 21 05:20:29 crc kubenswrapper[4839]: I0321 05:20:29.241954 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l4vch"]
Mar 21 05:20:30 crc kubenswrapper[4839]: I0321 05:20:30.464216 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" path="/var/lib/kubelet/pods/f723037e-e37c-4441-b840-6a3da3ec2fff/volumes"
Mar 21 05:20:39 crc kubenswrapper[4839]: I0321 05:20:39.452992 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"
Mar 21 05:20:39 crc kubenswrapper[4839]: E0321 05:20:39.453858 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:20:50 crc kubenswrapper[4839]: I0321 05:20:50.454135 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"
Mar 21 05:20:50 crc kubenswrapper[4839]: E0321 05:20:50.455764 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7"
podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:20:56 crc kubenswrapper[4839]: I0321 05:20:56.952435 4839 scope.go:117] "RemoveContainer" containerID="93fa052ae298171aa1d976a2185cd4fffe6a03fbc4b347a5bb68165274eaac3a"
Mar 21 05:21:03 crc kubenswrapper[4839]: I0321 05:21:03.456858 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"
Mar 21 05:21:03 crc kubenswrapper[4839]: E0321 05:21:03.457866 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:21:15 crc kubenswrapper[4839]: I0321 05:21:15.453603 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"
Mar 21 05:21:15 crc kubenswrapper[4839]: E0321 05:21:15.454907 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:21:26 crc kubenswrapper[4839]: I0321 05:21:26.463162 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2"
Mar 21 05:21:26 crc kubenswrapper[4839]: E0321 05:21:26.465387 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed
container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.131972 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rgwlh"]
Mar 21 05:21:34 crc kubenswrapper[4839]: E0321 05:21:34.133169 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="extract-utilities"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.133185 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="extract-utilities"
Mar 21 05:21:34 crc kubenswrapper[4839]: E0321 05:21:34.133212 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="registry-server"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.133220 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="registry-server"
Mar 21 05:21:34 crc kubenswrapper[4839]: E0321 05:21:34.133234 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89d49dc-a7f5-4a24-98c5-818fe0e99ded" containerName="oc"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.133241 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89d49dc-a7f5-4a24-98c5-818fe0e99ded" containerName="oc"
Mar 21 05:21:34 crc kubenswrapper[4839]: E0321 05:21:34.133255 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="extract-content"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.133261 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="extract-content"
Mar 21
05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.133464 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89d49dc-a7f5-4a24-98c5-818fe0e99ded" containerName="oc"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.133477 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f723037e-e37c-4441-b840-6a3da3ec2fff" containerName="registry-server"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.135158 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.148117 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rgwlh"]
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.251732 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-utilities\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.251980 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stfwd\" (UniqueName: \"kubernetes.io/projected/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-kube-api-access-stfwd\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.252023 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-catalog-content\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.354927 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stfwd\" (UniqueName: \"kubernetes.io/projected/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-kube-api-access-stfwd\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.355022 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-catalog-content\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.355192 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-utilities\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.356251 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-utilities\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.356260 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-catalog-content\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]:
I0321 05:21:34.384414 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stfwd\" (UniqueName: \"kubernetes.io/projected/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-kube-api-access-stfwd\") pod \"community-operators-rgwlh\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:34 crc kubenswrapper[4839]: I0321 05:21:34.467223 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rgwlh"
Mar 21 05:21:35 crc kubenswrapper[4839]: I0321 05:21:35.069802 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rgwlh"]
Mar 21 05:21:35 crc kubenswrapper[4839]: I0321 05:21:35.833500 4839 generic.go:334] "Generic (PLEG): container finished" podID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerID="e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712" exitCode=0
Mar 21 05:21:35 crc kubenswrapper[4839]: I0321 05:21:35.833616 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgwlh" event={"ID":"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5","Type":"ContainerDied","Data":"e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712"}
Mar 21 05:21:35 crc kubenswrapper[4839]: I0321 05:21:35.834336 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgwlh" event={"ID":"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5","Type":"ContainerStarted","Data":"02d803b24cd1bb5adbe92926bb02820c11a8c095dc0491408ccacdabb5366603"}
Mar 21 05:21:36 crc kubenswrapper[4839]: I0321 05:21:36.845472 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgwlh" event={"ID":"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5","Type":"ContainerStarted","Data":"4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d"}
Mar 21 05:21:37 crc kubenswrapper[4839]: I0321
05:21:37.860544 4839 generic.go:334] "Generic (PLEG): container finished" podID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerID="4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d" exitCode=0 Mar 21 05:21:37 crc kubenswrapper[4839]: I0321 05:21:37.860732 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgwlh" event={"ID":"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5","Type":"ContainerDied","Data":"4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d"} Mar 21 05:21:38 crc kubenswrapper[4839]: I0321 05:21:38.453591 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:21:38 crc kubenswrapper[4839]: E0321 05:21:38.453878 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:21:38 crc kubenswrapper[4839]: I0321 05:21:38.876415 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgwlh" event={"ID":"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5","Type":"ContainerStarted","Data":"a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9"} Mar 21 05:21:44 crc kubenswrapper[4839]: I0321 05:21:44.469168 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rgwlh" Mar 21 05:21:44 crc kubenswrapper[4839]: I0321 05:21:44.469780 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rgwlh" Mar 21 05:21:44 crc kubenswrapper[4839]: I0321 05:21:44.527983 4839 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rgwlh" Mar 21 05:21:44 crc kubenswrapper[4839]: I0321 05:21:44.554561 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rgwlh" podStartSLOduration=8.065752499 podStartE2EDuration="10.554533744s" podCreationTimestamp="2026-03-21 05:21:34 +0000 UTC" firstStartedPulling="2026-03-21 05:21:35.835522589 +0000 UTC m=+3500.163309265" lastFinishedPulling="2026-03-21 05:21:38.324303834 +0000 UTC m=+3502.652090510" observedRunningTime="2026-03-21 05:21:38.898604756 +0000 UTC m=+3503.226391432" watchObservedRunningTime="2026-03-21 05:21:44.554533744 +0000 UTC m=+3508.882320420" Mar 21 05:21:44 crc kubenswrapper[4839]: I0321 05:21:44.971752 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rgwlh" Mar 21 05:21:45 crc kubenswrapper[4839]: I0321 05:21:45.021097 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rgwlh"] Mar 21 05:21:46 crc kubenswrapper[4839]: I0321 05:21:46.940688 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rgwlh" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="registry-server" containerID="cri-o://a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9" gracePeriod=2 Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.441760 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rgwlh" Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.636490 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-utilities\") pod \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.636642 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stfwd\" (UniqueName: \"kubernetes.io/projected/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-kube-api-access-stfwd\") pod \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.636794 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-catalog-content\") pod \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\" (UID: \"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5\") " Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.637700 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-utilities" (OuterVolumeSpecName: "utilities") pod "b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" (UID: "b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.643718 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-kube-api-access-stfwd" (OuterVolumeSpecName: "kube-api-access-stfwd") pod "b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" (UID: "b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5"). InnerVolumeSpecName "kube-api-access-stfwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.738666 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.738702 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stfwd\" (UniqueName: \"kubernetes.io/projected/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-kube-api-access-stfwd\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.953006 4839 generic.go:334] "Generic (PLEG): container finished" podID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerID="a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9" exitCode=0 Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.953111 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgwlh" event={"ID":"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5","Type":"ContainerDied","Data":"a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9"} Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.953145 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rgwlh" Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.953181 4839 scope.go:117] "RemoveContainer" containerID="a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9" Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.953163 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rgwlh" event={"ID":"b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5","Type":"ContainerDied","Data":"02d803b24cd1bb5adbe92926bb02820c11a8c095dc0491408ccacdabb5366603"} Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.972381 4839 scope.go:117] "RemoveContainer" containerID="4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d" Mar 21 05:21:47 crc kubenswrapper[4839]: I0321 05:21:47.996465 4839 scope.go:117] "RemoveContainer" containerID="e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712" Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.039150 4839 scope.go:117] "RemoveContainer" containerID="a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9" Mar 21 05:21:48 crc kubenswrapper[4839]: E0321 05:21:48.039687 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9\": container with ID starting with a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9 not found: ID does not exist" containerID="a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9" Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.039730 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9"} err="failed to get container status \"a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9\": rpc error: code = NotFound desc = could not find container 
\"a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9\": container with ID starting with a0842676442eb476e02b4782f0eccdd718a0012632aed96e0f5955327f7196c9 not found: ID does not exist" Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.039751 4839 scope.go:117] "RemoveContainer" containerID="4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d" Mar 21 05:21:48 crc kubenswrapper[4839]: E0321 05:21:48.040334 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d\": container with ID starting with 4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d not found: ID does not exist" containerID="4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d" Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.040440 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d"} err="failed to get container status \"4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d\": rpc error: code = NotFound desc = could not find container \"4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d\": container with ID starting with 4562cd9c8152972a69c31d00df8df7e48df986202f27782b33ec2df30111899d not found: ID does not exist" Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.040739 4839 scope.go:117] "RemoveContainer" containerID="e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712" Mar 21 05:21:48 crc kubenswrapper[4839]: E0321 05:21:48.041303 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712\": container with ID starting with e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712 not found: ID does not exist" 
containerID="e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712" Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.041337 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712"} err="failed to get container status \"e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712\": rpc error: code = NotFound desc = could not find container \"e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712\": container with ID starting with e45e155e440eadd7b16f1d84fbd4ac903a306839524ebedb5e79c97e1f5c8712 not found: ID does not exist" Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.546166 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" (UID: "b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.555112 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.591776 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rgwlh"] Mar 21 05:21:48 crc kubenswrapper[4839]: I0321 05:21:48.600162 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rgwlh"] Mar 21 05:21:50 crc kubenswrapper[4839]: I0321 05:21:50.465058 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" path="/var/lib/kubelet/pods/b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5/volumes" Mar 21 05:21:53 crc kubenswrapper[4839]: I0321 05:21:53.453448 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:21:53 crc kubenswrapper[4839]: E0321 05:21:53.454373 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.142223 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567842-mkbkh"] Mar 21 05:22:00 crc kubenswrapper[4839]: E0321 05:22:00.143307 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="extract-utilities" Mar 21 05:22:00 crc 
kubenswrapper[4839]: I0321 05:22:00.143362 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="extract-utilities" Mar 21 05:22:00 crc kubenswrapper[4839]: E0321 05:22:00.143410 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="registry-server" Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.143417 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="registry-server" Mar 21 05:22:00 crc kubenswrapper[4839]: E0321 05:22:00.143433 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="extract-content" Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.143439 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="extract-content" Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.143658 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39946e1-8c4a-48cc-9b4a-4ce148c3b8e5" containerName="registry-server" Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.144397 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-mkbkh" Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.146533 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.146930 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.147224 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.153401 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-mkbkh"] Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.183153 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsphj\" (UniqueName: \"kubernetes.io/projected/98d91ef7-84b2-40fa-b268-b3a42085ecbd-kube-api-access-bsphj\") pod \"auto-csr-approver-29567842-mkbkh\" (UID: \"98d91ef7-84b2-40fa-b268-b3a42085ecbd\") " pod="openshift-infra/auto-csr-approver-29567842-mkbkh" Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.284804 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsphj\" (UniqueName: \"kubernetes.io/projected/98d91ef7-84b2-40fa-b268-b3a42085ecbd-kube-api-access-bsphj\") pod \"auto-csr-approver-29567842-mkbkh\" (UID: \"98d91ef7-84b2-40fa-b268-b3a42085ecbd\") " pod="openshift-infra/auto-csr-approver-29567842-mkbkh" Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.315065 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsphj\" (UniqueName: \"kubernetes.io/projected/98d91ef7-84b2-40fa-b268-b3a42085ecbd-kube-api-access-bsphj\") pod \"auto-csr-approver-29567842-mkbkh\" (UID: \"98d91ef7-84b2-40fa-b268-b3a42085ecbd\") " 
pod="openshift-infra/auto-csr-approver-29567842-mkbkh" Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.480162 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-mkbkh" Mar 21 05:22:00 crc kubenswrapper[4839]: I0321 05:22:00.981289 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-mkbkh"] Mar 21 05:22:01 crc kubenswrapper[4839]: I0321 05:22:01.078014 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567842-mkbkh" event={"ID":"98d91ef7-84b2-40fa-b268-b3a42085ecbd","Type":"ContainerStarted","Data":"674344555b6619a3e458790d8e4b299fcb88c8a29df166d1bd678fbe24712979"} Mar 21 05:22:03 crc kubenswrapper[4839]: I0321 05:22:03.101887 4839 generic.go:334] "Generic (PLEG): container finished" podID="98d91ef7-84b2-40fa-b268-b3a42085ecbd" containerID="eedac11fd6ab65c2edef01ab4734c0cb94cc7a431c8be7bf1c6cc4417de55aa3" exitCode=0 Mar 21 05:22:03 crc kubenswrapper[4839]: I0321 05:22:03.102116 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567842-mkbkh" event={"ID":"98d91ef7-84b2-40fa-b268-b3a42085ecbd","Type":"ContainerDied","Data":"eedac11fd6ab65c2edef01ab4734c0cb94cc7a431c8be7bf1c6cc4417de55aa3"} Mar 21 05:22:04 crc kubenswrapper[4839]: I0321 05:22:04.570991 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-mkbkh" Mar 21 05:22:04 crc kubenswrapper[4839]: I0321 05:22:04.672672 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsphj\" (UniqueName: \"kubernetes.io/projected/98d91ef7-84b2-40fa-b268-b3a42085ecbd-kube-api-access-bsphj\") pod \"98d91ef7-84b2-40fa-b268-b3a42085ecbd\" (UID: \"98d91ef7-84b2-40fa-b268-b3a42085ecbd\") " Mar 21 05:22:04 crc kubenswrapper[4839]: I0321 05:22:04.694758 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d91ef7-84b2-40fa-b268-b3a42085ecbd-kube-api-access-bsphj" (OuterVolumeSpecName: "kube-api-access-bsphj") pod "98d91ef7-84b2-40fa-b268-b3a42085ecbd" (UID: "98d91ef7-84b2-40fa-b268-b3a42085ecbd"). InnerVolumeSpecName "kube-api-access-bsphj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:22:04 crc kubenswrapper[4839]: I0321 05:22:04.776458 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsphj\" (UniqueName: \"kubernetes.io/projected/98d91ef7-84b2-40fa-b268-b3a42085ecbd-kube-api-access-bsphj\") on node \"crc\" DevicePath \"\"" Mar 21 05:22:05 crc kubenswrapper[4839]: I0321 05:22:05.125002 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567842-mkbkh" event={"ID":"98d91ef7-84b2-40fa-b268-b3a42085ecbd","Type":"ContainerDied","Data":"674344555b6619a3e458790d8e4b299fcb88c8a29df166d1bd678fbe24712979"} Mar 21 05:22:05 crc kubenswrapper[4839]: I0321 05:22:05.125536 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="674344555b6619a3e458790d8e4b299fcb88c8a29df166d1bd678fbe24712979" Mar 21 05:22:05 crc kubenswrapper[4839]: I0321 05:22:05.125111 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567842-mkbkh" Mar 21 05:22:05 crc kubenswrapper[4839]: I0321 05:22:05.452961 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:22:05 crc kubenswrapper[4839]: I0321 05:22:05.660163 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-7nx7s"] Mar 21 05:22:05 crc kubenswrapper[4839]: I0321 05:22:05.674205 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567836-7nx7s"] Mar 21 05:22:06 crc kubenswrapper[4839]: I0321 05:22:06.138506 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"26d5ad8d8c206d8ada93506f3a162dccbd9846e40dd3da26db34bab6bbf70437"} Mar 21 05:22:06 crc kubenswrapper[4839]: I0321 05:22:06.464902 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf873d9-ae04-40eb-b855-cca2a045773c" path="/var/lib/kubelet/pods/bcf873d9-ae04-40eb-b855-cca2a045773c/volumes" Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.530384 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wzl7t"] Mar 21 05:22:24 crc kubenswrapper[4839]: E0321 05:22:24.531544 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d91ef7-84b2-40fa-b268-b3a42085ecbd" containerName="oc" Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.531581 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d91ef7-84b2-40fa-b268-b3a42085ecbd" containerName="oc" Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.531884 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d91ef7-84b2-40fa-b268-b3a42085ecbd" containerName="oc" Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.534557 4839 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzl7t" Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.548776 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzl7t"] Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.701446 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmdmg\" (UniqueName: \"kubernetes.io/projected/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-kube-api-access-mmdmg\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t" Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.701923 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-catalog-content\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t" Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.701956 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-utilities\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t" Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.803489 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-catalog-content\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t" Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.803554 
4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-utilities\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t" Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.803667 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmdmg\" (UniqueName: \"kubernetes.io/projected/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-kube-api-access-mmdmg\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t" Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.804368 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-catalog-content\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t" Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.804505 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-utilities\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t" Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.827419 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmdmg\" (UniqueName: \"kubernetes.io/projected/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-kube-api-access-mmdmg\") pod \"redhat-marketplace-wzl7t\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") " pod="openshift-marketplace/redhat-marketplace-wzl7t" Mar 21 05:22:24 crc kubenswrapper[4839]: I0321 05:22:24.873521 4839 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzl7t" Mar 21 05:22:25 crc kubenswrapper[4839]: I0321 05:22:25.359840 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzl7t"] Mar 21 05:22:25 crc kubenswrapper[4839]: I0321 05:22:25.583434 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzl7t" event={"ID":"ff36c9ee-5581-48b3-be29-a6d5ad4b9476","Type":"ContainerStarted","Data":"ccce4a290aff8ab3e5d1554009c1a69f2fb741ea3f5fba347059d74bed0b8ded"} Mar 21 05:22:26 crc kubenswrapper[4839]: I0321 05:22:26.593937 4839 generic.go:334] "Generic (PLEG): container finished" podID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerID="97ce872ff52632fee3002b46fbe2d1087d0acb59cd9704873c29b470d51ff9e4" exitCode=0 Mar 21 05:22:26 crc kubenswrapper[4839]: I0321 05:22:26.594016 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzl7t" event={"ID":"ff36c9ee-5581-48b3-be29-a6d5ad4b9476","Type":"ContainerDied","Data":"97ce872ff52632fee3002b46fbe2d1087d0acb59cd9704873c29b470d51ff9e4"} Mar 21 05:22:28 crc kubenswrapper[4839]: I0321 05:22:28.615561 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzl7t" event={"ID":"ff36c9ee-5581-48b3-be29-a6d5ad4b9476","Type":"ContainerStarted","Data":"fd8610a23aa4477b05f1e471927e591e3db28e8c730e3f65952a7cfd15d24ba8"} Mar 21 05:22:29 crc kubenswrapper[4839]: I0321 05:22:29.629477 4839 generic.go:334] "Generic (PLEG): container finished" podID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerID="fd8610a23aa4477b05f1e471927e591e3db28e8c730e3f65952a7cfd15d24ba8" exitCode=0 Mar 21 05:22:29 crc kubenswrapper[4839]: I0321 05:22:29.629721 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzl7t" 
event={"ID":"ff36c9ee-5581-48b3-be29-a6d5ad4b9476","Type":"ContainerDied","Data":"fd8610a23aa4477b05f1e471927e591e3db28e8c730e3f65952a7cfd15d24ba8"}
Mar 21 05:22:31 crc kubenswrapper[4839]: I0321 05:22:31.657952 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzl7t" event={"ID":"ff36c9ee-5581-48b3-be29-a6d5ad4b9476","Type":"ContainerStarted","Data":"65a7d17f89d1557c72c5ffd06bb72faeb0e67e3bd8925184b20df8ed6afa7a8d"}
Mar 21 05:22:31 crc kubenswrapper[4839]: I0321 05:22:31.685754 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wzl7t" podStartSLOduration=3.84304255 podStartE2EDuration="7.685730332s" podCreationTimestamp="2026-03-21 05:22:24 +0000 UTC" firstStartedPulling="2026-03-21 05:22:26.59679398 +0000 UTC m=+3550.924580656" lastFinishedPulling="2026-03-21 05:22:30.439481762 +0000 UTC m=+3554.767268438" observedRunningTime="2026-03-21 05:22:31.678667383 +0000 UTC m=+3556.006454069" watchObservedRunningTime="2026-03-21 05:22:31.685730332 +0000 UTC m=+3556.013517008"
Mar 21 05:22:34 crc kubenswrapper[4839]: I0321 05:22:34.875158 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:34 crc kubenswrapper[4839]: I0321 05:22:34.875871 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:34 crc kubenswrapper[4839]: I0321 05:22:34.929141 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:35 crc kubenswrapper[4839]: I0321 05:22:35.757763 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:35 crc kubenswrapper[4839]: I0321 05:22:35.821705 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzl7t"]
Mar 21 05:22:37 crc kubenswrapper[4839]: I0321 05:22:37.728313 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wzl7t" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="registry-server" containerID="cri-o://65a7d17f89d1557c72c5ffd06bb72faeb0e67e3bd8925184b20df8ed6afa7a8d" gracePeriod=2
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.741975 4839 generic.go:334] "Generic (PLEG): container finished" podID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerID="65a7d17f89d1557c72c5ffd06bb72faeb0e67e3bd8925184b20df8ed6afa7a8d" exitCode=0
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.742022 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzl7t" event={"ID":"ff36c9ee-5581-48b3-be29-a6d5ad4b9476","Type":"ContainerDied","Data":"65a7d17f89d1557c72c5ffd06bb72faeb0e67e3bd8925184b20df8ed6afa7a8d"}
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.742373 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzl7t" event={"ID":"ff36c9ee-5581-48b3-be29-a6d5ad4b9476","Type":"ContainerDied","Data":"ccce4a290aff8ab3e5d1554009c1a69f2fb741ea3f5fba347059d74bed0b8ded"}
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.742389 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccce4a290aff8ab3e5d1554009c1a69f2fb741ea3f5fba347059d74bed0b8ded"
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.777168 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.901769 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-utilities\") pod \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") "
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.901864 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmdmg\" (UniqueName: \"kubernetes.io/projected/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-kube-api-access-mmdmg\") pod \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") "
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.901966 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-catalog-content\") pod \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\" (UID: \"ff36c9ee-5581-48b3-be29-a6d5ad4b9476\") "
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.903352 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-utilities" (OuterVolumeSpecName: "utilities") pod "ff36c9ee-5581-48b3-be29-a6d5ad4b9476" (UID: "ff36c9ee-5581-48b3-be29-a6d5ad4b9476"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.910501 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-kube-api-access-mmdmg" (OuterVolumeSpecName: "kube-api-access-mmdmg") pod "ff36c9ee-5581-48b3-be29-a6d5ad4b9476" (UID: "ff36c9ee-5581-48b3-be29-a6d5ad4b9476"). InnerVolumeSpecName "kube-api-access-mmdmg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:22:38 crc kubenswrapper[4839]: I0321 05:22:38.928183 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff36c9ee-5581-48b3-be29-a6d5ad4b9476" (UID: "ff36c9ee-5581-48b3-be29-a6d5ad4b9476"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:22:39 crc kubenswrapper[4839]: I0321 05:22:39.004604 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:39 crc kubenswrapper[4839]: I0321 05:22:39.004649 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:39 crc kubenswrapper[4839]: I0321 05:22:39.004658 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmdmg\" (UniqueName: \"kubernetes.io/projected/ff36c9ee-5581-48b3-be29-a6d5ad4b9476-kube-api-access-mmdmg\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:39 crc kubenswrapper[4839]: I0321 05:22:39.751250 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzl7t"
Mar 21 05:22:39 crc kubenswrapper[4839]: I0321 05:22:39.793492 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzl7t"]
Mar 21 05:22:39 crc kubenswrapper[4839]: I0321 05:22:39.809000 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzl7t"]
Mar 21 05:22:40 crc kubenswrapper[4839]: I0321 05:22:40.464311 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" path="/var/lib/kubelet/pods/ff36c9ee-5581-48b3-be29-a6d5ad4b9476/volumes"
Mar 21 05:22:57 crc kubenswrapper[4839]: I0321 05:22:57.245551 4839 scope.go:117] "RemoveContainer" containerID="7a58fda260a0b556c155576b775648d85fe42364027e5d171170d3cfbd959f32"
Mar 21 05:22:57 crc kubenswrapper[4839]: I0321 05:22:57.917520 4839 generic.go:334] "Generic (PLEG): container finished" podID="65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" containerID="0176599a8a2d6c5f1b857f924207691c2463c8d61ed3270470c8fc3d29535c3b" exitCode=0
Mar 21 05:22:57 crc kubenswrapper[4839]: I0321 05:22:57.917675 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3","Type":"ContainerDied","Data":"0176599a8a2d6c5f1b857f924207691c2463c8d61ed3270470c8fc3d29535c3b"}
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.329164 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.435808 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-config-data\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") "
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.435901 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-temporary\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") "
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.435984 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") "
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.436049 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x84s5\" (UniqueName: \"kubernetes.io/projected/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-kube-api-access-x84s5\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") "
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.436091 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-workdir\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") "
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.436211 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config-secret\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") "
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.436235 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ssh-key\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") "
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.436271 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") "
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.436305 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ca-certs\") pod \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\" (UID: \"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3\") "
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.437257 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.437007 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-config-data" (OuterVolumeSpecName: "config-data") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.438070 4839 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-config-data\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.438100 4839 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.442918 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.444222 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.448845 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-kube-api-access-x84s5" (OuterVolumeSpecName: "kube-api-access-x84s5") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "kube-api-access-x84s5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.474974 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.480297 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.480460 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.502637 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" (UID: "65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.545093 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.545127 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x84s5\" (UniqueName: \"kubernetes.io/projected/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-kube-api-access-x84s5\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.545140 4839 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.545149 4839 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.545158 4839 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ssh-key\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.545177 4839 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.545186 4839 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3-ca-certs\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.627480 4839 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.647015 4839 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.945994 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3","Type":"ContainerDied","Data":"de55efea3ef0459f6ae11516d74916a8089be737bb24fdba8c8fcffa3719ebe6"}
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.946097 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de55efea3ef0459f6ae11516d74916a8089be737bb24fdba8c8fcffa3719ebe6"
Mar 21 05:22:59 crc kubenswrapper[4839]: I0321 05:22:59.946058 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.553909 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 21 05:23:09 crc kubenswrapper[4839]: E0321 05:23:09.554850 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="extract-content"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.554865 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="extract-content"
Mar 21 05:23:09 crc kubenswrapper[4839]: E0321 05:23:09.554888 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="extract-utilities"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.554896 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="extract-utilities"
Mar 21 05:23:09 crc kubenswrapper[4839]: E0321 05:23:09.554920 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" containerName="tempest-tests-tempest-tests-runner"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.554929 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" containerName="tempest-tests-tempest-tests-runner"
Mar 21 05:23:09 crc kubenswrapper[4839]: E0321 05:23:09.554955 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="registry-server"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.554962 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="registry-server"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.555144 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff36c9ee-5581-48b3-be29-a6d5ad4b9476" containerName="registry-server"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.555162 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3" containerName="tempest-tests-tempest-tests-runner"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.555851 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.559676 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6v5zd"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.580338 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.743241 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4qk2\" (UniqueName: \"kubernetes.io/projected/d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8-kube-api-access-b4qk2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.743932 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.845487 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4qk2\" (UniqueName: \"kubernetes.io/projected/d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8-kube-api-access-b4qk2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.845680 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.846141 4839 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.871273 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4qk2\" (UniqueName: \"kubernetes.io/projected/d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8-kube-api-access-b4qk2\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.871940 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:23:09 crc kubenswrapper[4839]: I0321 05:23:09.938535 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 21 05:23:10 crc kubenswrapper[4839]: I0321 05:23:10.429228 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 21 05:23:10 crc kubenswrapper[4839]: I0321 05:23:10.435599 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 21 05:23:11 crc kubenswrapper[4839]: I0321 05:23:11.041072 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8","Type":"ContainerStarted","Data":"f84da512d0b4ee4769cf676e54c429ce52b3bd443ddce23391de8b5f03054507"}
Mar 21 05:23:15 crc kubenswrapper[4839]: I0321 05:23:15.087835 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8","Type":"ContainerStarted","Data":"4bcb391614db0784c3a6e55fa5bf5ea8e33f5a28e61318dca439f8c0bd5726b2"}
Mar 21 05:23:15 crc kubenswrapper[4839]: I0321 05:23:15.108341 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.6273181340000002 podStartE2EDuration="6.108317887s" podCreationTimestamp="2026-03-21 05:23:09 +0000 UTC" firstStartedPulling="2026-03-21 05:23:10.435335246 +0000 UTC m=+3594.763121922" lastFinishedPulling="2026-03-21 05:23:13.916334999 +0000 UTC m=+3598.244121675" observedRunningTime="2026-03-21 05:23:15.098912042 +0000 UTC m=+3599.426698718" watchObservedRunningTime="2026-03-21 05:23:15.108317887 +0000 UTC m=+3599.436104563"
Mar 21 05:23:40 crc kubenswrapper[4839]: I0321 05:23:40.731425 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9br9q"]
Mar 21 05:23:40 crc kubenswrapper[4839]: I0321 05:23:40.749200 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9br9q"]
Mar 21 05:23:40 crc kubenswrapper[4839]: I0321 05:23:40.749314 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9br9q"
Mar 21 05:23:40 crc kubenswrapper[4839]: I0321 05:23:40.939487 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qxxr\" (UniqueName: \"kubernetes.io/projected/aaf4c7b6-9115-45ff-b52d-3e232223ae62-kube-api-access-6qxxr\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q"
Mar 21 05:23:40 crc kubenswrapper[4839]: I0321 05:23:40.940312 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-utilities\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q"
Mar 21 05:23:40 crc kubenswrapper[4839]: I0321 05:23:40.940391 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-catalog-content\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q"
Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.042320 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qxxr\" (UniqueName: \"kubernetes.io/projected/aaf4c7b6-9115-45ff-b52d-3e232223ae62-kube-api-access-6qxxr\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q"
Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.042679 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-utilities\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q"
Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.042724 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-catalog-content\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q"
Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.043225 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-utilities\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q"
Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.043465 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-catalog-content\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q"
Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.065368 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qxxr\" (UniqueName: \"kubernetes.io/projected/aaf4c7b6-9115-45ff-b52d-3e232223ae62-kube-api-access-6qxxr\") pod \"certified-operators-9br9q\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " pod="openshift-marketplace/certified-operators-9br9q"
Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.077083 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9br9q"
Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.685828 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9br9q"]
Mar 21 05:23:41 crc kubenswrapper[4839]: I0321 05:23:41.772115 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9br9q" event={"ID":"aaf4c7b6-9115-45ff-b52d-3e232223ae62","Type":"ContainerStarted","Data":"15a447ccc083b3575f68f14b0abb62b3593891a818abc131c80bd34fc5738f62"}
Mar 21 05:23:42 crc kubenswrapper[4839]: I0321 05:23:42.787081 4839 generic.go:334] "Generic (PLEG): container finished" podID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerID="e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1" exitCode=0
Mar 21 05:23:42 crc kubenswrapper[4839]: I0321 05:23:42.787142 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9br9q" event={"ID":"aaf4c7b6-9115-45ff-b52d-3e232223ae62","Type":"ContainerDied","Data":"e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1"}
Mar 21 05:23:43 crc kubenswrapper[4839]: I0321 05:23:43.801188 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9br9q" event={"ID":"aaf4c7b6-9115-45ff-b52d-3e232223ae62","Type":"ContainerStarted","Data":"20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884"}
Mar 21 05:23:44 crc kubenswrapper[4839]: I0321 05:23:44.813941 4839 generic.go:334] "Generic (PLEG): container finished" podID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerID="20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884" exitCode=0
Mar 21 05:23:44 crc kubenswrapper[4839]: I0321 05:23:44.814061 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9br9q" event={"ID":"aaf4c7b6-9115-45ff-b52d-3e232223ae62","Type":"ContainerDied","Data":"20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884"}
Mar 21 05:23:46 crc kubenswrapper[4839]: I0321 05:23:46.841922 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9br9q" event={"ID":"aaf4c7b6-9115-45ff-b52d-3e232223ae62","Type":"ContainerStarted","Data":"b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141"}
Mar 21 05:23:46 crc kubenswrapper[4839]: I0321 05:23:46.868856 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9br9q" podStartSLOduration=3.139453393 podStartE2EDuration="6.868830878s" podCreationTimestamp="2026-03-21 05:23:40 +0000 UTC" firstStartedPulling="2026-03-21 05:23:42.78933967 +0000 UTC m=+3627.117126346" lastFinishedPulling="2026-03-21 05:23:46.518717155 +0000 UTC m=+3630.846503831" observedRunningTime="2026-03-21 05:23:46.861648006 +0000 UTC m=+3631.189434682" watchObservedRunningTime="2026-03-21 05:23:46.868830878 +0000 UTC m=+3631.196617544"
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.077684 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9br9q"
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.078446 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9br9q"
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.129850 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9br9q"
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.574958 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-72qrq/must-gather-mxjpl"]
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.576907 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/must-gather-mxjpl"
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.579763 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-72qrq"/"default-dockercfg-b7xd9"
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.588594 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-72qrq"/"openshift-service-ca.crt"
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.588888 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-72qrq"/"kube-root-ca.crt"
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.609617 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-72qrq/must-gather-mxjpl"]
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.731191 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4s42\" (UniqueName: \"kubernetes.io/projected/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-kube-api-access-t4s42\") pod \"must-gather-mxjpl\" (UID: \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " pod="openshift-must-gather-72qrq/must-gather-mxjpl"
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.731883 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-must-gather-output\") pod \"must-gather-mxjpl\" (UID: \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " pod="openshift-must-gather-72qrq/must-gather-mxjpl"
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.833794 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4s42\" (UniqueName: \"kubernetes.io/projected/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-kube-api-access-t4s42\") pod \"must-gather-mxjpl\" (UID: \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " pod="openshift-must-gather-72qrq/must-gather-mxjpl"
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.833916 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-must-gather-output\") pod \"must-gather-mxjpl\" (UID: \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " pod="openshift-must-gather-72qrq/must-gather-mxjpl"
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.834465 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-must-gather-output\") pod \"must-gather-mxjpl\" (UID: \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " pod="openshift-must-gather-72qrq/must-gather-mxjpl"
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.852483 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4s42\" (UniqueName: \"kubernetes.io/projected/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-kube-api-access-t4s42\") pod \"must-gather-mxjpl\" (UID: \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " pod="openshift-must-gather-72qrq/must-gather-mxjpl"
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.904419 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/must-gather-mxjpl"
Mar 21 05:23:51 crc kubenswrapper[4839]: I0321 05:23:51.934823 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9br9q"
Mar 21 05:23:52 crc kubenswrapper[4839]: I0321 05:23:52.008457 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9br9q"]
Mar 21 05:23:52 crc kubenswrapper[4839]: I0321 05:23:52.403435 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-72qrq/must-gather-mxjpl"]
Mar 21 05:23:52 crc kubenswrapper[4839]: I0321 05:23:52.895010 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/must-gather-mxjpl" event={"ID":"de78e0a8-6c32-44ae-8f44-443eb0f1dd25","Type":"ContainerStarted","Data":"d623f9d6f3bcfead5a92b85d22dc41afd83ca6b9def71de722479fb2dbc37fbf"}
Mar 21 05:23:53 crc kubenswrapper[4839]: I0321 05:23:53.903580 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9br9q" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="registry-server" containerID="cri-o://b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141" gracePeriod=2
Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.396097 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.499202 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qxxr\" (UniqueName: \"kubernetes.io/projected/aaf4c7b6-9115-45ff-b52d-3e232223ae62-kube-api-access-6qxxr\") pod \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.499276 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-utilities\") pod \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.499407 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-catalog-content\") pod \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\" (UID: \"aaf4c7b6-9115-45ff-b52d-3e232223ae62\") " Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.501548 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-utilities" (OuterVolumeSpecName: "utilities") pod "aaf4c7b6-9115-45ff-b52d-3e232223ae62" (UID: "aaf4c7b6-9115-45ff-b52d-3e232223ae62"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.507534 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaf4c7b6-9115-45ff-b52d-3e232223ae62-kube-api-access-6qxxr" (OuterVolumeSpecName: "kube-api-access-6qxxr") pod "aaf4c7b6-9115-45ff-b52d-3e232223ae62" (UID: "aaf4c7b6-9115-45ff-b52d-3e232223ae62"). InnerVolumeSpecName "kube-api-access-6qxxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.574230 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aaf4c7b6-9115-45ff-b52d-3e232223ae62" (UID: "aaf4c7b6-9115-45ff-b52d-3e232223ae62"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.602080 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qxxr\" (UniqueName: \"kubernetes.io/projected/aaf4c7b6-9115-45ff-b52d-3e232223ae62-kube-api-access-6qxxr\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.602124 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.602139 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaf4c7b6-9115-45ff-b52d-3e232223ae62-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.914897 4839 generic.go:334] "Generic (PLEG): container finished" podID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerID="b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141" exitCode=0 Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.914953 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9br9q" event={"ID":"aaf4c7b6-9115-45ff-b52d-3e232223ae62","Type":"ContainerDied","Data":"b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141"} Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.914981 4839 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-9br9q" event={"ID":"aaf4c7b6-9115-45ff-b52d-3e232223ae62","Type":"ContainerDied","Data":"15a447ccc083b3575f68f14b0abb62b3593891a818abc131c80bd34fc5738f62"} Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.914999 4839 scope.go:117] "RemoveContainer" containerID="b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.915009 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9br9q" Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.951451 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9br9q"] Mar 21 05:23:54 crc kubenswrapper[4839]: I0321 05:23:54.959377 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9br9q"] Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.465499 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" path="/var/lib/kubelet/pods/aaf4c7b6-9115-45ff-b52d-3e232223ae62/volumes" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.622821 4839 scope.go:117] "RemoveContainer" containerID="20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.692882 4839 scope.go:117] "RemoveContainer" containerID="e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.720908 4839 scope.go:117] "RemoveContainer" containerID="b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141" Mar 21 05:23:56 crc kubenswrapper[4839]: E0321 05:23:56.721900 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141\": container with ID 
starting with b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141 not found: ID does not exist" containerID="b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.721951 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141"} err="failed to get container status \"b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141\": rpc error: code = NotFound desc = could not find container \"b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141\": container with ID starting with b587c903c28806ae060b27518769299417f133a1cceb7d65518b0aab1147a141 not found: ID does not exist" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.721984 4839 scope.go:117] "RemoveContainer" containerID="20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884" Mar 21 05:23:56 crc kubenswrapper[4839]: E0321 05:23:56.722395 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884\": container with ID starting with 20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884 not found: ID does not exist" containerID="20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.722435 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884"} err="failed to get container status \"20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884\": rpc error: code = NotFound desc = could not find container \"20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884\": container with ID starting with 20c506a490bea9b182aeeed1af36acdae8097f4ff646607287d213f582f77884 not found: 
ID does not exist" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.722455 4839 scope.go:117] "RemoveContainer" containerID="e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1" Mar 21 05:23:56 crc kubenswrapper[4839]: E0321 05:23:56.722786 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1\": container with ID starting with e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1 not found: ID does not exist" containerID="e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1" Mar 21 05:23:56 crc kubenswrapper[4839]: I0321 05:23:56.722842 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1"} err="failed to get container status \"e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1\": rpc error: code = NotFound desc = could not find container \"e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1\": container with ID starting with e8d7065951a6898c812e94884898191f01a58bc63b26600de006202fa8c929e1 not found: ID does not exist" Mar 21 05:23:57 crc kubenswrapper[4839]: I0321 05:23:57.952186 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/must-gather-mxjpl" event={"ID":"de78e0a8-6c32-44ae-8f44-443eb0f1dd25","Type":"ContainerStarted","Data":"0acccee08d8e21b640f974b3184f1317711fe544ebde8a6ac1eadbd5cddfd459"} Mar 21 05:23:57 crc kubenswrapper[4839]: I0321 05:23:57.953125 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/must-gather-mxjpl" event={"ID":"de78e0a8-6c32-44ae-8f44-443eb0f1dd25","Type":"ContainerStarted","Data":"4aacadbb7c340286a8bc5bb514479c70f999217550d1118742a4cf28f857f96a"} Mar 21 05:23:57 crc kubenswrapper[4839]: I0321 05:23:57.999705 4839 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-72qrq/must-gather-mxjpl" podStartSLOduration=2.686641435 podStartE2EDuration="6.999681656s" podCreationTimestamp="2026-03-21 05:23:51 +0000 UTC" firstStartedPulling="2026-03-21 05:23:52.410342956 +0000 UTC m=+3636.738129632" lastFinishedPulling="2026-03-21 05:23:56.723383177 +0000 UTC m=+3641.051169853" observedRunningTime="2026-03-21 05:23:57.991157825 +0000 UTC m=+3642.318944531" watchObservedRunningTime="2026-03-21 05:23:57.999681656 +0000 UTC m=+3642.327468332" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.171445 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567844-qzhl5"] Mar 21 05:24:00 crc kubenswrapper[4839]: E0321 05:24:00.172251 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="extract-content" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.172270 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="extract-content" Mar 21 05:24:00 crc kubenswrapper[4839]: E0321 05:24:00.172287 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="extract-utilities" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.172294 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="extract-utilities" Mar 21 05:24:00 crc kubenswrapper[4839]: E0321 05:24:00.172328 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="registry-server" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.172335 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="registry-server" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.172488 4839 
memory_manager.go:354] "RemoveStaleState removing state" podUID="aaf4c7b6-9115-45ff-b52d-3e232223ae62" containerName="registry-server" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.173251 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-qzhl5" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.176762 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.177123 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.178550 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.194459 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-qzhl5"] Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.330691 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65fhp\" (UniqueName: \"kubernetes.io/projected/ad5b8a95-e16d-42a8-9069-5294c8934559-kube-api-access-65fhp\") pod \"auto-csr-approver-29567844-qzhl5\" (UID: \"ad5b8a95-e16d-42a8-9069-5294c8934559\") " pod="openshift-infra/auto-csr-approver-29567844-qzhl5" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.433819 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65fhp\" (UniqueName: \"kubernetes.io/projected/ad5b8a95-e16d-42a8-9069-5294c8934559-kube-api-access-65fhp\") pod \"auto-csr-approver-29567844-qzhl5\" (UID: \"ad5b8a95-e16d-42a8-9069-5294c8934559\") " pod="openshift-infra/auto-csr-approver-29567844-qzhl5" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.467141 4839 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-65fhp\" (UniqueName: \"kubernetes.io/projected/ad5b8a95-e16d-42a8-9069-5294c8934559-kube-api-access-65fhp\") pod \"auto-csr-approver-29567844-qzhl5\" (UID: \"ad5b8a95-e16d-42a8-9069-5294c8934559\") " pod="openshift-infra/auto-csr-approver-29567844-qzhl5" Mar 21 05:24:00 crc kubenswrapper[4839]: I0321 05:24:00.499494 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-qzhl5" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.097809 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-qzhl5"] Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.448486 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-72qrq/crc-debug-lb22p"] Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.449859 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.560383 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a565de2-9452-4c7a-85c5-8fe6f15f6859-host\") pod \"crc-debug-lb22p\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.560526 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g9vn\" (UniqueName: \"kubernetes.io/projected/8a565de2-9452-4c7a-85c5-8fe6f15f6859-kube-api-access-4g9vn\") pod \"crc-debug-lb22p\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.662403 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/8a565de2-9452-4c7a-85c5-8fe6f15f6859-host\") pod \"crc-debug-lb22p\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.662537 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g9vn\" (UniqueName: \"kubernetes.io/projected/8a565de2-9452-4c7a-85c5-8fe6f15f6859-kube-api-access-4g9vn\") pod \"crc-debug-lb22p\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.662543 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a565de2-9452-4c7a-85c5-8fe6f15f6859-host\") pod \"crc-debug-lb22p\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.698764 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g9vn\" (UniqueName: \"kubernetes.io/projected/8a565de2-9452-4c7a-85c5-8fe6f15f6859-kube-api-access-4g9vn\") pod \"crc-debug-lb22p\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: I0321 05:24:01.794156 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:24:01 crc kubenswrapper[4839]: W0321 05:24:01.837160 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a565de2_9452_4c7a_85c5_8fe6f15f6859.slice/crio-e108ca9ad3bac0d26710fce2c914fcea5a0bce2060d74cc3faf00e429ecc5144 WatchSource:0}: Error finding container e108ca9ad3bac0d26710fce2c914fcea5a0bce2060d74cc3faf00e429ecc5144: Status 404 returned error can't find the container with id e108ca9ad3bac0d26710fce2c914fcea5a0bce2060d74cc3faf00e429ecc5144 Mar 21 05:24:02 crc kubenswrapper[4839]: I0321 05:24:02.003075 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567844-qzhl5" event={"ID":"ad5b8a95-e16d-42a8-9069-5294c8934559","Type":"ContainerStarted","Data":"3df04d0604b00287c03f656614f8f501c7dd27b6e7b919a6ab1a6ddbacba037a"} Mar 21 05:24:02 crc kubenswrapper[4839]: I0321 05:24:02.005146 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-lb22p" event={"ID":"8a565de2-9452-4c7a-85c5-8fe6f15f6859","Type":"ContainerStarted","Data":"e108ca9ad3bac0d26710fce2c914fcea5a0bce2060d74cc3faf00e429ecc5144"} Mar 21 05:24:04 crc kubenswrapper[4839]: I0321 05:24:04.024762 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567844-qzhl5" event={"ID":"ad5b8a95-e16d-42a8-9069-5294c8934559","Type":"ContainerStarted","Data":"cb7937f2ae576fec589579ad2dd17797c203b9a5a4641193da2cc618f8fd881c"} Mar 21 05:24:05 crc kubenswrapper[4839]: I0321 05:24:05.037603 4839 generic.go:334] "Generic (PLEG): container finished" podID="ad5b8a95-e16d-42a8-9069-5294c8934559" containerID="cb7937f2ae576fec589579ad2dd17797c203b9a5a4641193da2cc618f8fd881c" exitCode=0 Mar 21 05:24:05 crc kubenswrapper[4839]: I0321 05:24:05.037678 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567844-qzhl5" event={"ID":"ad5b8a95-e16d-42a8-9069-5294c8934559","Type":"ContainerDied","Data":"cb7937f2ae576fec589579ad2dd17797c203b9a5a4641193da2cc618f8fd881c"} Mar 21 05:24:06 crc kubenswrapper[4839]: I0321 05:24:06.620707 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-qzhl5" Mar 21 05:24:06 crc kubenswrapper[4839]: I0321 05:24:06.702530 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65fhp\" (UniqueName: \"kubernetes.io/projected/ad5b8a95-e16d-42a8-9069-5294c8934559-kube-api-access-65fhp\") pod \"ad5b8a95-e16d-42a8-9069-5294c8934559\" (UID: \"ad5b8a95-e16d-42a8-9069-5294c8934559\") " Mar 21 05:24:06 crc kubenswrapper[4839]: I0321 05:24:06.722500 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5b8a95-e16d-42a8-9069-5294c8934559-kube-api-access-65fhp" (OuterVolumeSpecName: "kube-api-access-65fhp") pod "ad5b8a95-e16d-42a8-9069-5294c8934559" (UID: "ad5b8a95-e16d-42a8-9069-5294c8934559"). InnerVolumeSpecName "kube-api-access-65fhp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:24:06 crc kubenswrapper[4839]: I0321 05:24:06.804979 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65fhp\" (UniqueName: \"kubernetes.io/projected/ad5b8a95-e16d-42a8-9069-5294c8934559-kube-api-access-65fhp\") on node \"crc\" DevicePath \"\"" Mar 21 05:24:07 crc kubenswrapper[4839]: I0321 05:24:07.061675 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567844-qzhl5" event={"ID":"ad5b8a95-e16d-42a8-9069-5294c8934559","Type":"ContainerDied","Data":"3df04d0604b00287c03f656614f8f501c7dd27b6e7b919a6ab1a6ddbacba037a"} Mar 21 05:24:07 crc kubenswrapper[4839]: I0321 05:24:07.061728 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3df04d0604b00287c03f656614f8f501c7dd27b6e7b919a6ab1a6ddbacba037a" Mar 21 05:24:07 crc kubenswrapper[4839]: I0321 05:24:07.061730 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567844-qzhl5" Mar 21 05:24:07 crc kubenswrapper[4839]: I0321 05:24:07.710442 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-z4kh5"] Mar 21 05:24:07 crc kubenswrapper[4839]: I0321 05:24:07.724697 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567838-z4kh5"] Mar 21 05:24:08 crc kubenswrapper[4839]: I0321 05:24:08.463638 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a" path="/var/lib/kubelet/pods/a41e2c3a-6a99-4d7f-9f7c-1baf73fe815a/volumes" Mar 21 05:24:17 crc kubenswrapper[4839]: I0321 05:24:17.472985 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-lb22p" event={"ID":"8a565de2-9452-4c7a-85c5-8fe6f15f6859","Type":"ContainerStarted","Data":"1992910c0bf58b0e82afab7177c5dce6191e1367cd3ed72f39dc82352bb794e1"} Mar 
21 05:24:17 crc kubenswrapper[4839]: I0321 05:24:17.493784 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-72qrq/crc-debug-lb22p" podStartSLOduration=1.14990143 podStartE2EDuration="16.493757323s" podCreationTimestamp="2026-03-21 05:24:01 +0000 UTC" firstStartedPulling="2026-03-21 05:24:01.839411356 +0000 UTC m=+3646.167198032" lastFinishedPulling="2026-03-21 05:24:17.183267259 +0000 UTC m=+3661.511053925" observedRunningTime="2026-03-21 05:24:17.48689598 +0000 UTC m=+3661.814682656" watchObservedRunningTime="2026-03-21 05:24:17.493757323 +0000 UTC m=+3661.821544009" Mar 21 05:24:30 crc kubenswrapper[4839]: I0321 05:24:30.980165 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:24:30 crc kubenswrapper[4839]: I0321 05:24:30.981126 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:24:57 crc kubenswrapper[4839]: I0321 05:24:57.411355 4839 scope.go:117] "RemoveContainer" containerID="aa966754301c031bb6355b4136e2fe214f5819ce3ea77c126ebfb20a4377b523" Mar 21 05:25:00 crc kubenswrapper[4839]: I0321 05:25:00.930469 4839 generic.go:334] "Generic (PLEG): container finished" podID="8a565de2-9452-4c7a-85c5-8fe6f15f6859" containerID="1992910c0bf58b0e82afab7177c5dce6191e1367cd3ed72f39dc82352bb794e1" exitCode=0 Mar 21 05:25:00 crc kubenswrapper[4839]: I0321 05:25:00.930588 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-lb22p" 
event={"ID":"8a565de2-9452-4c7a-85c5-8fe6f15f6859","Type":"ContainerDied","Data":"1992910c0bf58b0e82afab7177c5dce6191e1367cd3ed72f39dc82352bb794e1"} Mar 21 05:25:00 crc kubenswrapper[4839]: I0321 05:25:00.981017 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:25:00 crc kubenswrapper[4839]: I0321 05:25:00.981076 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.054599 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.092237 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-72qrq/crc-debug-lb22p"] Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.106833 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-72qrq/crc-debug-lb22p"] Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.227270 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a565de2-9452-4c7a-85c5-8fe6f15f6859-host\") pod \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.227373 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a565de2-9452-4c7a-85c5-8fe6f15f6859-host" (OuterVolumeSpecName: "host") pod "8a565de2-9452-4c7a-85c5-8fe6f15f6859" (UID: "8a565de2-9452-4c7a-85c5-8fe6f15f6859"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.227589 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g9vn\" (UniqueName: \"kubernetes.io/projected/8a565de2-9452-4c7a-85c5-8fe6f15f6859-kube-api-access-4g9vn\") pod \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\" (UID: \"8a565de2-9452-4c7a-85c5-8fe6f15f6859\") " Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.228148 4839 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a565de2-9452-4c7a-85c5-8fe6f15f6859-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.240427 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a565de2-9452-4c7a-85c5-8fe6f15f6859-kube-api-access-4g9vn" (OuterVolumeSpecName: "kube-api-access-4g9vn") pod "8a565de2-9452-4c7a-85c5-8fe6f15f6859" (UID: "8a565de2-9452-4c7a-85c5-8fe6f15f6859"). InnerVolumeSpecName "kube-api-access-4g9vn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.329661 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g9vn\" (UniqueName: \"kubernetes.io/projected/8a565de2-9452-4c7a-85c5-8fe6f15f6859-kube-api-access-4g9vn\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.465107 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a565de2-9452-4c7a-85c5-8fe6f15f6859" path="/var/lib/kubelet/pods/8a565de2-9452-4c7a-85c5-8fe6f15f6859/volumes" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.948993 4839 scope.go:117] "RemoveContainer" containerID="1992910c0bf58b0e82afab7177c5dce6191e1367cd3ed72f39dc82352bb794e1" Mar 21 05:25:02 crc kubenswrapper[4839]: I0321 05:25:02.949136 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-lb22p" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.266837 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-72qrq/crc-debug-nkdhd"] Mar 21 05:25:03 crc kubenswrapper[4839]: E0321 05:25:03.267794 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a565de2-9452-4c7a-85c5-8fe6f15f6859" containerName="container-00" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.267823 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a565de2-9452-4c7a-85c5-8fe6f15f6859" containerName="container-00" Mar 21 05:25:03 crc kubenswrapper[4839]: E0321 05:25:03.267866 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5b8a95-e16d-42a8-9069-5294c8934559" containerName="oc" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.267874 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5b8a95-e16d-42a8-9069-5294c8934559" containerName="oc" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.268070 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad5b8a95-e16d-42a8-9069-5294c8934559" containerName="oc" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.268096 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a565de2-9452-4c7a-85c5-8fe6f15f6859" containerName="container-00" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.268883 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.452905 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84tzr\" (UniqueName: \"kubernetes.io/projected/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-kube-api-access-84tzr\") pod \"crc-debug-nkdhd\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.453219 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-host\") pod \"crc-debug-nkdhd\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.555977 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84tzr\" (UniqueName: \"kubernetes.io/projected/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-kube-api-access-84tzr\") pod \"crc-debug-nkdhd\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.556219 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-host\") pod \"crc-debug-nkdhd\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.556345 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-host\") pod \"crc-debug-nkdhd\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc 
kubenswrapper[4839]: I0321 05:25:03.580772 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84tzr\" (UniqueName: \"kubernetes.io/projected/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-kube-api-access-84tzr\") pod \"crc-debug-nkdhd\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.587846 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.959650 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" event={"ID":"c815b0ab-12b4-4c7e-a22c-5ae8680936c0","Type":"ContainerStarted","Data":"e901d2e0584567d056a14a65a0dbf9adc3eea19c597864f96a32867a7a1b9a1d"} Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.959962 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" event={"ID":"c815b0ab-12b4-4c7e-a22c-5ae8680936c0","Type":"ContainerStarted","Data":"1ef3a429fab53a62499b69a9b777966ca288c36bcf5ff402c6d44788234a6533"} Mar 21 05:25:03 crc kubenswrapper[4839]: I0321 05:25:03.972694 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" podStartSLOduration=0.972675305 podStartE2EDuration="972.675305ms" podCreationTimestamp="2026-03-21 05:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:25:03.972253863 +0000 UTC m=+3708.300040559" watchObservedRunningTime="2026-03-21 05:25:03.972675305 +0000 UTC m=+3708.300461981" Mar 21 05:25:04 crc kubenswrapper[4839]: I0321 05:25:04.971436 4839 generic.go:334] "Generic (PLEG): container finished" podID="c815b0ab-12b4-4c7e-a22c-5ae8680936c0" 
containerID="e901d2e0584567d056a14a65a0dbf9adc3eea19c597864f96a32867a7a1b9a1d" exitCode=0 Mar 21 05:25:04 crc kubenswrapper[4839]: I0321 05:25:04.971490 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" event={"ID":"c815b0ab-12b4-4c7e-a22c-5ae8680936c0","Type":"ContainerDied","Data":"e901d2e0584567d056a14a65a0dbf9adc3eea19c597864f96a32867a7a1b9a1d"} Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.072791 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.106477 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-72qrq/crc-debug-nkdhd"] Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.115129 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-72qrq/crc-debug-nkdhd"] Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.204333 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-host\") pod \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.204457 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84tzr\" (UniqueName: \"kubernetes.io/projected/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-kube-api-access-84tzr\") pod \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\" (UID: \"c815b0ab-12b4-4c7e-a22c-5ae8680936c0\") " Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.204561 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-host" (OuterVolumeSpecName: "host") pod "c815b0ab-12b4-4c7e-a22c-5ae8680936c0" (UID: "c815b0ab-12b4-4c7e-a22c-5ae8680936c0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.206035 4839 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.211925 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-kube-api-access-84tzr" (OuterVolumeSpecName: "kube-api-access-84tzr") pod "c815b0ab-12b4-4c7e-a22c-5ae8680936c0" (UID: "c815b0ab-12b4-4c7e-a22c-5ae8680936c0"). InnerVolumeSpecName "kube-api-access-84tzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.308587 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84tzr\" (UniqueName: \"kubernetes.io/projected/c815b0ab-12b4-4c7e-a22c-5ae8680936c0-kube-api-access-84tzr\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.464600 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c815b0ab-12b4-4c7e-a22c-5ae8680936c0" path="/var/lib/kubelet/pods/c815b0ab-12b4-4c7e-a22c-5ae8680936c0/volumes" Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.997869 4839 scope.go:117] "RemoveContainer" containerID="e901d2e0584567d056a14a65a0dbf9adc3eea19c597864f96a32867a7a1b9a1d" Mar 21 05:25:06 crc kubenswrapper[4839]: I0321 05:25:06.997985 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-nkdhd" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.285888 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-72qrq/crc-debug-xfsb6"] Mar 21 05:25:07 crc kubenswrapper[4839]: E0321 05:25:07.286742 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c815b0ab-12b4-4c7e-a22c-5ae8680936c0" containerName="container-00" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.286759 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c815b0ab-12b4-4c7e-a22c-5ae8680936c0" containerName="container-00" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.286966 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c815b0ab-12b4-4c7e-a22c-5ae8680936c0" containerName="container-00" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.289334 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.430076 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nckp\" (UniqueName: \"kubernetes.io/projected/dfc60151-2f6c-4842-b0e9-4194fa1ff596-kube-api-access-9nckp\") pod \"crc-debug-xfsb6\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.430180 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfc60151-2f6c-4842-b0e9-4194fa1ff596-host\") pod \"crc-debug-xfsb6\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.531384 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/dfc60151-2f6c-4842-b0e9-4194fa1ff596-host\") pod \"crc-debug-xfsb6\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.531512 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nckp\" (UniqueName: \"kubernetes.io/projected/dfc60151-2f6c-4842-b0e9-4194fa1ff596-kube-api-access-9nckp\") pod \"crc-debug-xfsb6\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.531614 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfc60151-2f6c-4842-b0e9-4194fa1ff596-host\") pod \"crc-debug-xfsb6\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.551603 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nckp\" (UniqueName: \"kubernetes.io/projected/dfc60151-2f6c-4842-b0e9-4194fa1ff596-kube-api-access-9nckp\") pod \"crc-debug-xfsb6\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:07 crc kubenswrapper[4839]: I0321 05:25:07.608272 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:08 crc kubenswrapper[4839]: I0321 05:25:08.009261 4839 generic.go:334] "Generic (PLEG): container finished" podID="dfc60151-2f6c-4842-b0e9-4194fa1ff596" containerID="42273614d5636fbbe841ce75b9205756068fc6541255df4d8420f7f3fe4fd250" exitCode=0 Mar 21 05:25:08 crc kubenswrapper[4839]: I0321 05:25:08.009346 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-xfsb6" event={"ID":"dfc60151-2f6c-4842-b0e9-4194fa1ff596","Type":"ContainerDied","Data":"42273614d5636fbbe841ce75b9205756068fc6541255df4d8420f7f3fe4fd250"} Mar 21 05:25:08 crc kubenswrapper[4839]: I0321 05:25:08.009798 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/crc-debug-xfsb6" event={"ID":"dfc60151-2f6c-4842-b0e9-4194fa1ff596","Type":"ContainerStarted","Data":"d9e2cc211386c3fc3bc6977bd20cd9c80e6183dffa782cd8c68d6688c551308e"} Mar 21 05:25:08 crc kubenswrapper[4839]: I0321 05:25:08.050453 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-72qrq/crc-debug-xfsb6"] Mar 21 05:25:08 crc kubenswrapper[4839]: I0321 05:25:08.059850 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-72qrq/crc-debug-xfsb6"] Mar 21 05:25:09 crc kubenswrapper[4839]: I0321 05:25:09.132801 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:09 crc kubenswrapper[4839]: I0321 05:25:09.261038 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nckp\" (UniqueName: \"kubernetes.io/projected/dfc60151-2f6c-4842-b0e9-4194fa1ff596-kube-api-access-9nckp\") pod \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " Mar 21 05:25:09 crc kubenswrapper[4839]: I0321 05:25:09.262053 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfc60151-2f6c-4842-b0e9-4194fa1ff596-host\") pod \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\" (UID: \"dfc60151-2f6c-4842-b0e9-4194fa1ff596\") " Mar 21 05:25:09 crc kubenswrapper[4839]: I0321 05:25:09.263543 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dfc60151-2f6c-4842-b0e9-4194fa1ff596-host" (OuterVolumeSpecName: "host") pod "dfc60151-2f6c-4842-b0e9-4194fa1ff596" (UID: "dfc60151-2f6c-4842-b0e9-4194fa1ff596"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:25:09 crc kubenswrapper[4839]: I0321 05:25:09.269897 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfc60151-2f6c-4842-b0e9-4194fa1ff596-kube-api-access-9nckp" (OuterVolumeSpecName: "kube-api-access-9nckp") pod "dfc60151-2f6c-4842-b0e9-4194fa1ff596" (UID: "dfc60151-2f6c-4842-b0e9-4194fa1ff596"). InnerVolumeSpecName "kube-api-access-9nckp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:25:09 crc kubenswrapper[4839]: I0321 05:25:09.365917 4839 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dfc60151-2f6c-4842-b0e9-4194fa1ff596-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:09 crc kubenswrapper[4839]: I0321 05:25:09.365967 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nckp\" (UniqueName: \"kubernetes.io/projected/dfc60151-2f6c-4842-b0e9-4194fa1ff596-kube-api-access-9nckp\") on node \"crc\" DevicePath \"\"" Mar 21 05:25:10 crc kubenswrapper[4839]: I0321 05:25:10.027883 4839 scope.go:117] "RemoveContainer" containerID="42273614d5636fbbe841ce75b9205756068fc6541255df4d8420f7f3fe4fd250" Mar 21 05:25:10 crc kubenswrapper[4839]: I0321 05:25:10.027925 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/crc-debug-xfsb6" Mar 21 05:25:10 crc kubenswrapper[4839]: I0321 05:25:10.465692 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfc60151-2f6c-4842-b0e9-4194fa1ff596" path="/var/lib/kubelet/pods/dfc60151-2f6c-4842-b0e9-4194fa1ff596/volumes" Mar 21 05:25:25 crc kubenswrapper[4839]: I0321 05:25:25.826319 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d9cf4c794-jb7lf_37ba14c5-dfc7-4268-86c9-c0efe37fe6c9/barbican-api/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.018222 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d9cf4c794-jb7lf_37ba14c5-dfc7-4268-86c9-c0efe37fe6c9/barbican-api-log/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.033804 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7b946d96f4-chv76_e6e03301-fb6e-467b-b19d-21b5c475d35c/barbican-keystone-listener/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.100937 4839 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7b946d96f4-chv76_e6e03301-fb6e-467b-b19d-21b5c475d35c/barbican-keystone-listener-log/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.250432 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-db77b8b5f-grbp8_3563c0f9-9e82-4798-bae3-b3836a6b5866/barbican-worker/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.293087 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-db77b8b5f-grbp8_3563c0f9-9e82-4798-bae3-b3836a6b5866/barbican-worker-log/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.504447 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/ceilometer-central-agent/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.632537 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/ceilometer-notification-agent/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.633182 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz_a1d76458-d587-4960-9bcc-7e3d3122b44d/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.686240 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/proxy-httpd/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.763409 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/sg-core/0.log" Mar 21 05:25:26 crc kubenswrapper[4839]: I0321 05:25:26.901749 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5162af3c-3b00-4643-afd9-680f6e2f5c03/cinder-api/0.log" Mar 21 05:25:26 crc 
kubenswrapper[4839]: I0321 05:25:26.907055 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5162af3c-3b00-4643-afd9-680f6e2f5c03/cinder-api-log/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.060124 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_77964653-d242-4258-b06e-c9cd0fb64d84/cinder-scheduler/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.144394 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_77964653-d242-4258-b06e-c9cd0fb64d84/probe/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.258239 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx_a58d82e4-2de9-4680-a08c-6eeb775ed08a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.466437 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-n4nl2_a31699b4-0a8f-42c8-b7f4-319ef1d5423a/init/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.490351 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-qkclf_ab9d4433-fe0e-471b-84f8-568b31920ed3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.704367 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-n4nl2_a31699b4-0a8f-42c8-b7f4-319ef1d5423a/init/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.755550 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-n4nl2_a31699b4-0a8f-42c8-b7f4-319ef1d5423a/dnsmasq-dns/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.832247 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt_7f875f01-020a-4cd6-950a-4dbb6ccb344e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:27 crc kubenswrapper[4839]: I0321 05:25:27.987113 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e3e15ec-7425-4e0a-99a8-db3bb1cd486c/glance-httpd/0.log" Mar 21 05:25:28 crc kubenswrapper[4839]: I0321 05:25:28.017013 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e3e15ec-7425-4e0a-99a8-db3bb1cd486c/glance-log/0.log" Mar 21 05:25:28 crc kubenswrapper[4839]: I0321 05:25:28.157561 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c7aa4192-53bb-412e-b25e-1fe47c59fa75/glance-log/0.log" Mar 21 05:25:28 crc kubenswrapper[4839]: I0321 05:25:28.194252 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c7aa4192-53bb-412e-b25e-1fe47c59fa75/glance-httpd/0.log" Mar 21 05:25:28 crc kubenswrapper[4839]: I0321 05:25:28.398126 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9c97f4dbd-k2scs_579308eb-854d-4160-ad35-8677f2d0e634/horizon/0.log" Mar 21 05:25:28 crc kubenswrapper[4839]: I0321 05:25:28.664221 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7_268d87b5-57ec-49ff-be62-fe59e6b4b819/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:28 crc kubenswrapper[4839]: I0321 05:25:28.880538 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9c97f4dbd-k2scs_579308eb-854d-4160-ad35-8677f2d0e634/horizon-log/0.log" Mar 21 05:25:29 crc kubenswrapper[4839]: I0321 05:25:29.156900 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xdvx2_7538d496-3768-42b7-9f2e-70e1b44a9d6b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:29 crc kubenswrapper[4839]: I0321 05:25:29.175698 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cb996784d-fvhvp_6a3fcdf0-3099-467b-928b-89a4876130fe/keystone-api/0.log" Mar 21 05:25:29 crc kubenswrapper[4839]: I0321 05:25:29.208781 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29567821-rmctn_666be2f4-0416-4086-94d3-c48c82f380b2/keystone-cron/0.log" Mar 21 05:25:29 crc kubenswrapper[4839]: I0321 05:25:29.377806 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1626316f-b029-4424-b783-25eeb2790eb2/kube-state-metrics/0.log" Mar 21 05:25:29 crc kubenswrapper[4839]: I0321 05:25:29.924769 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-w48j6_2d056acb-0183-4157-a830-fff4cd1dcacf/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.037174 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-748dbf85fc-jslwv_cd21ac8b-d3c0-4f0c-9205-d60d55425d8a/neutron-httpd/0.log" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.058811 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-748dbf85fc-jslwv_cd21ac8b-d3c0-4f0c-9205-d60d55425d8a/neutron-api/0.log" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.437148 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d_ceef8f42-5d77-44c1-ac39-edf0080f68e0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.930682 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_152d0351-12d2-4cf1-ad49-fd943b223442/nova-cell0-conductor-conductor/0.log" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.979890 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.979965 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.980022 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.981479 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"26d5ad8d8c206d8ada93506f3a162dccbd9846e40dd3da26db34bab6bbf70437"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:25:30 crc kubenswrapper[4839]: I0321 05:25:30.981743 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://26d5ad8d8c206d8ada93506f3a162dccbd9846e40dd3da26db34bab6bbf70437" gracePeriod=600 Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.009711 4839 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_nova-api-0_627bf6a3-cf5d-42e1-9250-ba6684bb2cfc/nova-api-log/0.log" Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.246021 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="26d5ad8d8c206d8ada93506f3a162dccbd9846e40dd3da26db34bab6bbf70437" exitCode=0 Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.246105 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"26d5ad8d8c206d8ada93506f3a162dccbd9846e40dd3da26db34bab6bbf70437"} Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.246219 4839 scope.go:117] "RemoveContainer" containerID="ae0086f9eacefc01aae5e4f5f99607f42c8a56a9eb17fd3e93fb691fd9a335b2" Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.258065 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3194b187-fe06-4eed-b725-995cef2b05a0/nova-cell1-conductor-conductor/0.log" Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.354952 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_627bf6a3-cf5d-42e1-9250-ba6684bb2cfc/nova-api-api/0.log" Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.369509 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2/nova-cell1-novncproxy-novncproxy/0.log" Mar 21 05:25:31 crc kubenswrapper[4839]: I0321 05:25:31.770235 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0aafbc7f-e890-4a32-8531-f148aeea18e6/nova-metadata-log/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.168069 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0aafbc7f-e890-4a32-8531-f148aeea18e6/nova-metadata-metadata/0.log" 
Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.200693 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_bbecccff-0ecc-44ff-a57b-f7289b8bcf5a/nova-scheduler-scheduler/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.250388 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-hf42f_3f8728ca-30ff-41a9-8a48-e3bb7911bcc7/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.259560 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"} Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.306126 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d22e92-45bd-4d1e-954e-3ade801245d4/mysql-bootstrap/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.549718 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d22e92-45bd-4d1e-954e-3ade801245d4/galera/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.570236 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d22e92-45bd-4d1e-954e-3ade801245d4/mysql-bootstrap/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.588212 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4f1edf0d-f220-4815-aeb6-e4507576247a/mysql-bootstrap/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.869678 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_52b9f7e1-d86c-457e-9391-eee855a9f7a7/openstackclient/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.887011 4839 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4f1edf0d-f220-4815-aeb6-e4507576247a/galera/0.log" Mar 21 05:25:32 crc kubenswrapper[4839]: I0321 05:25:32.910020 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4f1edf0d-f220-4815-aeb6-e4507576247a/mysql-bootstrap/0.log" Mar 21 05:25:33 crc kubenswrapper[4839]: I0321 05:25:33.315835 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovsdb-server-init/0.log" Mar 21 05:25:33 crc kubenswrapper[4839]: I0321 05:25:33.327884 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mx5tf_64d13111-845e-4c61-a4ce-483ddfb799b7/openstack-network-exporter/0.log" Mar 21 05:25:33 crc kubenswrapper[4839]: I0321 05:25:33.603096 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovsdb-server/0.log" Mar 21 05:25:33 crc kubenswrapper[4839]: I0321 05:25:33.604760 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovsdb-server-init/0.log" Mar 21 05:25:33 crc kubenswrapper[4839]: I0321 05:25:33.655991 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovs-vswitchd/0.log" Mar 21 05:25:33 crc kubenswrapper[4839]: I0321 05:25:33.802057 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-qt5s4_b31b64cb-0266-4b8a-9fcb-ae5e36c8309a/ovn-controller/0.log" Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.000407 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-v4wqq_7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 
05:25:34.055006 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dbcaa531-3e09-48c7-8535-76f3e1f5c303/openstack-network-exporter/0.log" Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.088590 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dbcaa531-3e09-48c7-8535-76f3e1f5c303/ovn-northd/0.log" Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.200424 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4a7a1028-3deb-4033-890c-db0861c6a9a2/openstack-network-exporter/0.log" Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.343796 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4a7a1028-3deb-4033-890c-db0861c6a9a2/ovsdbserver-nb/0.log" Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.390618 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8c2e5ef4-e4c0-4278-897e-ce5d00b4079d/openstack-network-exporter/0.log" Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.527907 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8c2e5ef4-e4c0-4278-897e-ce5d00b4079d/ovsdbserver-sb/0.log" Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.674641 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75bd8b89b4-djjlh_bf5a44f8-8eb1-4953-b611-a02576e414ea/placement-log/0.log" Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.688907 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75bd8b89b4-djjlh_bf5a44f8-8eb1-4953-b611-a02576e414ea/placement-api/0.log" Mar 21 05:25:34 crc kubenswrapper[4839]: I0321 05:25:34.838730 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fa82c4a0-2b0e-4e22-9e91-7fc899122414/setup-container/0.log" Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.094750 4839 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fa82c4a0-2b0e-4e22-9e91-7fc899122414/rabbitmq/0.log" Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.096048 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fa82c4a0-2b0e-4e22-9e91-7fc899122414/setup-container/0.log" Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.102982 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bfff67da-8ea4-4798-9b8d-58a3abac4347/setup-container/0.log" Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.328050 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bfff67da-8ea4-4798-9b8d-58a3abac4347/rabbitmq/0.log" Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.388842 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bfff67da-8ea4-4798-9b8d-58a3abac4347/setup-container/0.log" Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.402585 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r_66c3e343-3306-455d-89d7-db17c1bd53ed/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.571252 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pgfnn_a6dd2bff-543f-4ebb-b908-3e528f322548/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.710206 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq_acb0bb61-c53a-4171-bca5-4a3141d6904a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.888480 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-55fzl_26adbd7b-7994-4bea-9f94-338881339833/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:35 crc kubenswrapper[4839]: I0321 05:25:35.911287 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-chfcw_39dbacec-c845-4f19-92a9-c0e63fba203c/ssh-known-hosts-edpm-deployment/0.log" Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.210384 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b66c6bfff-76gfx_1af5fd5b-8392-4e55-b3fb-fdc9285dd135/proxy-httpd/0.log" Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.214798 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b66c6bfff-76gfx_1af5fd5b-8392-4e55-b3fb-fdc9285dd135/proxy-server/0.log" Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.327257 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-kkvzq_5484abbf-53f2-445a-b6fe-0996eba95345/swift-ring-rebalance/0.log" Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.423546 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-reaper/0.log" Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.444955 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-auditor/0.log" Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.559187 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-replicator/0.log" Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.675304 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-server/0.log" Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 
05:25:36.696343 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-auditor/0.log" Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.768187 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-replicator/0.log" Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.828422 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-server/0.log" Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.925265 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-updater/0.log" Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.935412 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-auditor/0.log" Mar 21 05:25:36 crc kubenswrapper[4839]: I0321 05:25:36.949799 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-expirer/0.log" Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.087334 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-replicator/0.log" Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.165697 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-server/0.log" Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.167628 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-updater/0.log" Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.173716 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/rsync/0.log" Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.328914 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/swift-recon-cron/0.log" Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.549548 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3/tempest-tests-tempest-tests-runner/0.log" Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.690506 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8/test-operator-logs-container/0.log" Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.865204 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq_4f49b501-bec5-4fe1-89d7-ff3c217ba580/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:37 crc kubenswrapper[4839]: I0321 05:25:37.886730 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h_f9d60b3b-b1b4-4d98-9da2-e152ac410c81/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:25:46 crc kubenswrapper[4839]: I0321 05:25:46.986845 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3c49bdbb-0c05-4dea-8de8-61ca09b7e84c/memcached/0.log" Mar 21 05:25:57 crc kubenswrapper[4839]: I0321 05:25:57.483166 4839 scope.go:117] "RemoveContainer" containerID="0f0fb05b1ad7b9c8c86908bf5eed059955dab4bf2710991db435f65e1f3837bc" Mar 21 05:25:57 crc kubenswrapper[4839]: I0321 05:25:57.506784 4839 scope.go:117] "RemoveContainer" containerID="0fb7e2104f45f80289c410cb7ed43ea4ebad69cb7a12b53a2a2c205806ea1801" Mar 21 05:25:57 crc kubenswrapper[4839]: I0321 
05:25:57.539918 4839 scope.go:117] "RemoveContainer" containerID="36e58fa15812a6897b50d6db0ed7a274a81303b9dec23efb000319a8d9a33254" Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.139097 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567846-8dwrk"] Mar 21 05:26:00 crc kubenswrapper[4839]: E0321 05:26:00.139860 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfc60151-2f6c-4842-b0e9-4194fa1ff596" containerName="container-00" Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.139876 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc60151-2f6c-4842-b0e9-4194fa1ff596" containerName="container-00" Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.140134 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfc60151-2f6c-4842-b0e9-4194fa1ff596" containerName="container-00" Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.140842 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-8dwrk" Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.143437 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.143755 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.146173 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.150283 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-8dwrk"] Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.238145 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk44c\" (UniqueName: \"kubernetes.io/projected/274043bb-38cf-435f-9cb1-01d194d34325-kube-api-access-gk44c\") pod \"auto-csr-approver-29567846-8dwrk\" (UID: \"274043bb-38cf-435f-9cb1-01d194d34325\") " pod="openshift-infra/auto-csr-approver-29567846-8dwrk" Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.339402 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk44c\" (UniqueName: \"kubernetes.io/projected/274043bb-38cf-435f-9cb1-01d194d34325-kube-api-access-gk44c\") pod \"auto-csr-approver-29567846-8dwrk\" (UID: \"274043bb-38cf-435f-9cb1-01d194d34325\") " pod="openshift-infra/auto-csr-approver-29567846-8dwrk" Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.366807 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk44c\" (UniqueName: \"kubernetes.io/projected/274043bb-38cf-435f-9cb1-01d194d34325-kube-api-access-gk44c\") pod \"auto-csr-approver-29567846-8dwrk\" (UID: \"274043bb-38cf-435f-9cb1-01d194d34325\") " 
pod="openshift-infra/auto-csr-approver-29567846-8dwrk" Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.461108 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-8dwrk" Mar 21 05:26:00 crc kubenswrapper[4839]: I0321 05:26:00.933369 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-8dwrk"] Mar 21 05:26:01 crc kubenswrapper[4839]: I0321 05:26:01.558747 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567846-8dwrk" event={"ID":"274043bb-38cf-435f-9cb1-01d194d34325","Type":"ContainerStarted","Data":"8641a02b7c87efdb8a0624f1a6768117dfedc165ea974aa527a2fee8b60e842e"} Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.223002 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-2mkmz_0c51ffa0-2285-4f7e-af09-0cafba139934/manager/0.log" Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.476775 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-9s4vt_ee9d64a7-0d03-4cb0-a266-47b26f9957b5/manager/0.log" Mar 21 05:26:02 crc kubenswrapper[4839]: E0321 05:26:02.569071 4839 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod274043bb_38cf_435f_9cb1_01d194d34325.slice/crio-conmon-3e713d4f3a2eccb8fba4adfa096046056f6cf5d095f6cdc7fc919eb1fb945456.scope\": RecentStats: unable to find data in memory cache]" Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.570033 4839 generic.go:334] "Generic (PLEG): container finished" podID="274043bb-38cf-435f-9cb1-01d194d34325" containerID="3e713d4f3a2eccb8fba4adfa096046056f6cf5d095f6cdc7fc919eb1fb945456" exitCode=0 Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.570069 
4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567846-8dwrk" event={"ID":"274043bb-38cf-435f-9cb1-01d194d34325","Type":"ContainerDied","Data":"3e713d4f3a2eccb8fba4adfa096046056f6cf5d095f6cdc7fc919eb1fb945456"} Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.704150 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/util/0.log" Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.919109 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/pull/0.log" Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.932033 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/util/0.log" Mar 21 05:26:02 crc kubenswrapper[4839]: I0321 05:26:02.955543 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/pull/0.log" Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.190573 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/pull/0.log" Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.247475 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/util/0.log" Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.247884 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/extract/0.log" Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.254228 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-dncxc_05f30a88-e899-4727-9440-981d010a1342/manager/0.log" Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.474254 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-2n27d_fd731e7e-440b-4e77-a778-08a4a62e0c9f/manager/0.log" Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.511823 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-6s6q7_d3dc722f-f66c-46a0-9b1a-ae1b9c4de060/manager/0.log" Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.727249 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-d7h7r_acb1d7ac-b3f9-4564-8346-344ffb5c3964/manager/0.log" Mar 21 05:26:03 crc kubenswrapper[4839]: I0321 05:26:03.997291 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-8sg4d_ccec0d11-294b-43a2-be2e-fcef8a6818c6/manager/0.log" Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.009845 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-8dwrk" Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.084403 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-bsdjs_ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b/manager/0.log" Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.119096 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk44c\" (UniqueName: \"kubernetes.io/projected/274043bb-38cf-435f-9cb1-01d194d34325-kube-api-access-gk44c\") pod \"274043bb-38cf-435f-9cb1-01d194d34325\" (UID: \"274043bb-38cf-435f-9cb1-01d194d34325\") " Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.125984 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/274043bb-38cf-435f-9cb1-01d194d34325-kube-api-access-gk44c" (OuterVolumeSpecName: "kube-api-access-gk44c") pod "274043bb-38cf-435f-9cb1-01d194d34325" (UID: "274043bb-38cf-435f-9cb1-01d194d34325"). InnerVolumeSpecName "kube-api-access-gk44c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.221096 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk44c\" (UniqueName: \"kubernetes.io/projected/274043bb-38cf-435f-9cb1-01d194d34325-kube-api-access-gk44c\") on node \"crc\" DevicePath \"\"" Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.282336 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-gzh8j_6074766c-0ecd-4051-a676-dcc21b24184f/manager/0.log" Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.288712 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-k4lg5_7a7bf7a3-acea-4059-8a89-db576f3588d1/manager/0.log" Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.533875 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-sp4j4_2162bafb-7e49-435c-9591-d8b725f10336/manager/0.log" Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.544510 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-94vpf_70702cd5-6815-4a01-98a4-2f4dfaeef839/manager/0.log" Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.595960 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567846-8dwrk" event={"ID":"274043bb-38cf-435f-9cb1-01d194d34325","Type":"ContainerDied","Data":"8641a02b7c87efdb8a0624f1a6768117dfedc165ea974aa527a2fee8b60e842e"} Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.596017 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8641a02b7c87efdb8a0624f1a6768117dfedc165ea974aa527a2fee8b60e842e" Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.596101 4839 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567846-8dwrk" Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.776834 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-wjw9j_6914418f-3639-4ebc-a58d-d8b478cbf6b4/manager/0.log" Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.778351 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-6p4mn_faac458b-73d9-4fb8-9f1c-50f7521088b0/manager/0.log" Mar 21 05:26:04 crc kubenswrapper[4839]: I0321 05:26:04.965974 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-8gc22_859b11bc-e9fb-40a2-a053-66a07337965c/manager/0.log" Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.105901 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-rbk96"] Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.140267 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567840-rbk96"] Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.155292 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-948579bb7-j6fx6_b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59/operator/0.log" Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.262240 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-lj8h4_6ff65f56-ff89-43c6-b087-6d3c3b72d2ef/registry-server/0.log" Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.526624 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-qt58c_379b40a1-e3f5-448b-b668-0f168457e5d0/manager/0.log" Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.696415 
4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-x75fd_361c2d7b-9a75-41fd-953d-4b1bd64ca6df/manager/0.log" Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.810491 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-lzbtt_c8584ecb-dc92-4cec-9178-3017f09095da/operator/0.log" Mar 21 05:26:05 crc kubenswrapper[4839]: I0321 05:26:05.988429 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-xt7xt_2045f5d2-c67e-47cd-b16d-3c69d449f099/manager/0.log" Mar 21 05:26:06 crc kubenswrapper[4839]: I0321 05:26:06.258850 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-btkvt_d3ea9c2e-11a4-492e-9e84-8294e81ce775/manager/0.log" Mar 21 05:26:06 crc kubenswrapper[4839]: I0321 05:26:06.324390 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-7f4qh_5eeb53bd-3988-458f-baa5-d265e0178aea/manager/0.log" Mar 21 05:26:06 crc kubenswrapper[4839]: I0321 05:26:06.488634 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b89d49dc-a7f5-4a24-98c5-818fe0e99ded" path="/var/lib/kubelet/pods/b89d49dc-a7f5-4a24-98c5-818fe0e99ded/volumes" Mar 21 05:26:06 crc kubenswrapper[4839]: I0321 05:26:06.489103 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5ccd4855ff-jx6pn_06f9e67e-8978-46a1-9dc8-c511197241e2/manager/0.log" Mar 21 05:26:06 crc kubenswrapper[4839]: I0321 05:26:06.573609 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-hh27s_1d32b541-7b80-492b-adac-e51d5090b668/manager/0.log" Mar 21 05:26:24 crc kubenswrapper[4839]: I0321 
05:26:24.980419 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-whlp9_40014780-8cb8-47fa-8b2c-c4fb7d04a85c/control-plane-machine-set-operator/0.log" Mar 21 05:26:25 crc kubenswrapper[4839]: I0321 05:26:25.183856 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nmj8p_c4d393d7-42d7-4b7d-a3cd-f7e325b97c54/kube-rbac-proxy/0.log" Mar 21 05:26:25 crc kubenswrapper[4839]: I0321 05:26:25.185208 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nmj8p_c4d393d7-42d7-4b7d-a3cd-f7e325b97c54/machine-api-operator/0.log" Mar 21 05:26:36 crc kubenswrapper[4839]: I0321 05:26:36.496208 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-x2cpt_daed7a16-7023-463e-9d60-3f56f091f73e/cert-manager-controller/0.log" Mar 21 05:26:36 crc kubenswrapper[4839]: I0321 05:26:36.688658 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-v297k_814a91ac-5e2f-4479-88a3-254e4216e50c/cert-manager-cainjector/0.log" Mar 21 05:26:36 crc kubenswrapper[4839]: I0321 05:26:36.771790 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-s9zj6_d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f/cert-manager-webhook/0.log" Mar 21 05:26:48 crc kubenswrapper[4839]: I0321 05:26:48.822912 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-j5z4g_8e7a66bb-3731-4f75-9a7f-5b9d07a36b39/nmstate-console-plugin/0.log" Mar 21 05:26:49 crc kubenswrapper[4839]: I0321 05:26:49.030770 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-k57vv_42329e42-8b9b-45ed-ab04-bf12468d8859/nmstate-handler/0.log" Mar 21 05:26:49 crc kubenswrapper[4839]: I0321 
05:26:49.112350 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-z5wkc_fdc1639d-742f-41a6-8cb7-318997a4a8b1/nmstate-metrics/0.log" Mar 21 05:26:49 crc kubenswrapper[4839]: I0321 05:26:49.138285 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-z5wkc_fdc1639d-742f-41a6-8cb7-318997a4a8b1/kube-rbac-proxy/0.log" Mar 21 05:26:49 crc kubenswrapper[4839]: I0321 05:26:49.259022 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-vrlf4_fbd83ba5-ac43-45f6-8a15-78ba82a246f7/nmstate-operator/0.log" Mar 21 05:26:49 crc kubenswrapper[4839]: I0321 05:26:49.304867 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-7ghd4_5a2485ca-cb21-4edf-b074-f7ac255f45f8/nmstate-webhook/0.log" Mar 21 05:26:57 crc kubenswrapper[4839]: I0321 05:26:57.659516 4839 scope.go:117] "RemoveContainer" containerID="229d14a49fd481dc353dc5d371b3e82a7f1a7396db2fffc8de8355fe9e2338cb" Mar 21 05:27:15 crc kubenswrapper[4839]: I0321 05:27:15.706658 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-q9zb9_f0373e22-a3f9-48c6-abd6-fc8147ea49e6/kube-rbac-proxy/0.log" Mar 21 05:27:15 crc kubenswrapper[4839]: I0321 05:27:15.855823 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-q9zb9_f0373e22-a3f9-48c6-abd6-fc8147ea49e6/controller/0.log" Mar 21 05:27:15 crc kubenswrapper[4839]: I0321 05:27:15.939521 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log" Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.104935 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log" Mar 21 05:27:16 crc 
kubenswrapper[4839]: I0321 05:27:16.148773 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log" Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.154129 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log" Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.175869 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log" Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.376967 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log" Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.389622 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log" Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.419342 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log" Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.464993 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log" Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.587101 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log" Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.589892 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log" Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.609159 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log" Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.691706 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/controller/0.log" Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.769918 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/kube-rbac-proxy/0.log" Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.799551 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/frr-metrics/0.log" Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.910642 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/kube-rbac-proxy-frr/0.log" Mar 21 05:27:16 crc kubenswrapper[4839]: I0321 05:27:16.974953 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/reloader/0.log" Mar 21 05:27:17 crc kubenswrapper[4839]: I0321 05:27:17.139414 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-qm7jb_06b3d06a-d515-469a-9a88-77b3f1e6c6f0/frr-k8s-webhook-server/0.log" Mar 21 05:27:17 crc kubenswrapper[4839]: I0321 05:27:17.345499 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b8d865685-2pk4g_888cdc0b-241d-456a-9a9f-3ed253b3dbf3/manager/0.log" Mar 21 05:27:17 crc kubenswrapper[4839]: I0321 05:27:17.434900 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7df97b96d6-7wvzr_ca0627e2-8115-4514-ba93-47e00a823a31/webhook-server/0.log" Mar 21 05:27:17 crc kubenswrapper[4839]: I0321 05:27:17.641085 4839 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2wb4_6b330e86-2ac2-4bee-8a6e-364cb2f093d7/kube-rbac-proxy/0.log" Mar 21 05:27:18 crc kubenswrapper[4839]: I0321 05:27:18.196856 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2wb4_6b330e86-2ac2-4bee-8a6e-364cb2f093d7/speaker/0.log" Mar 21 05:27:18 crc kubenswrapper[4839]: I0321 05:27:18.525293 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/frr/0.log" Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.132708 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/util/0.log" Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.550692 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/util/0.log" Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.551988 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/pull/0.log" Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.569874 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/pull/0.log" Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.755898 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/extract/0.log" Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.765799 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/util/0.log" Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.773485 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/pull/0.log" Mar 21 05:27:30 crc kubenswrapper[4839]: I0321 05:27:30.939915 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/util/0.log" Mar 21 05:27:31 crc kubenswrapper[4839]: I0321 05:27:31.567296 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/pull/0.log" Mar 21 05:27:31 crc kubenswrapper[4839]: I0321 05:27:31.598277 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/pull/0.log" Mar 21 05:27:31 crc kubenswrapper[4839]: I0321 05:27:31.605802 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/util/0.log" Mar 21 05:27:31 crc kubenswrapper[4839]: I0321 05:27:31.862228 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/pull/0.log" Mar 21 05:27:31 crc kubenswrapper[4839]: I0321 05:27:31.862687 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/extract/0.log" Mar 21 
05:27:31 crc kubenswrapper[4839]: I0321 05:27:31.931457 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/util/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.047889 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-utilities/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.211753 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-utilities/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.242582 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-content/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.276844 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-content/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.463228 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-utilities/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.468470 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-content/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.700749 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-utilities/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.926735 4839 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/registry-server/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.929241 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-content/0.log" Mar 21 05:27:32 crc kubenswrapper[4839]: I0321 05:27:32.992786 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-content/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.022156 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-utilities/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.109348 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-content/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.121945 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-utilities/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.365394 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qb9bp_df9bf95b-dc8f-4104-9c6c-873159393850/marketplace-operator/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.549193 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-utilities/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.794463 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-utilities/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.799829 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-content/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.815904 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-content/0.log" Mar 21 05:27:33 crc kubenswrapper[4839]: I0321 05:27:33.828575 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/registry-server/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.024892 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-content/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.025870 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-utilities/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.064287 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-utilities/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.286336 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-content/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.292420 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/registry-server/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.302699 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-utilities/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.325110 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-content/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.504288 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-content/0.log" Mar 21 05:27:34 crc kubenswrapper[4839]: I0321 05:27:34.509888 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-utilities/0.log" Mar 21 05:27:35 crc kubenswrapper[4839]: I0321 05:27:35.119402 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/registry-server/0.log" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.150538 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567848-p6wgd"] Mar 21 05:28:00 crc kubenswrapper[4839]: E0321 05:28:00.151465 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274043bb-38cf-435f-9cb1-01d194d34325" containerName="oc" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.151477 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="274043bb-38cf-435f-9cb1-01d194d34325" containerName="oc" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.151672 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="274043bb-38cf-435f-9cb1-01d194d34325" containerName="oc" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.152261 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-p6wgd" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.154387 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.154878 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.155208 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.161740 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-p6wgd"] Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.201011 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsqsn\" (UniqueName: \"kubernetes.io/projected/c8214f95-33aa-486b-bb82-915b2c5b2cf6-kube-api-access-tsqsn\") pod \"auto-csr-approver-29567848-p6wgd\" (UID: \"c8214f95-33aa-486b-bb82-915b2c5b2cf6\") " pod="openshift-infra/auto-csr-approver-29567848-p6wgd" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.306420 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsqsn\" (UniqueName: \"kubernetes.io/projected/c8214f95-33aa-486b-bb82-915b2c5b2cf6-kube-api-access-tsqsn\") pod \"auto-csr-approver-29567848-p6wgd\" (UID: \"c8214f95-33aa-486b-bb82-915b2c5b2cf6\") " pod="openshift-infra/auto-csr-approver-29567848-p6wgd" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.335388 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsqsn\" (UniqueName: 
\"kubernetes.io/projected/c8214f95-33aa-486b-bb82-915b2c5b2cf6-kube-api-access-tsqsn\") pod \"auto-csr-approver-29567848-p6wgd\" (UID: \"c8214f95-33aa-486b-bb82-915b2c5b2cf6\") " pod="openshift-infra/auto-csr-approver-29567848-p6wgd" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.476199 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-p6wgd" Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.980746 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:28:00 crc kubenswrapper[4839]: I0321 05:28:00.981117 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:28:01 crc kubenswrapper[4839]: I0321 05:28:01.102787 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-p6wgd"] Mar 21 05:28:01 crc kubenswrapper[4839]: I0321 05:28:01.637249 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567848-p6wgd" event={"ID":"c8214f95-33aa-486b-bb82-915b2c5b2cf6","Type":"ContainerStarted","Data":"bdc7f332bf5b36a06c38ba377e71ce315c1ab651db2e2b3b15234e9d9fa884ed"} Mar 21 05:28:02 crc kubenswrapper[4839]: I0321 05:28:02.647762 4839 generic.go:334] "Generic (PLEG): container finished" podID="c8214f95-33aa-486b-bb82-915b2c5b2cf6" containerID="3c4dbc17150a4b84d9f816e99c3c6823e1cf60ce3010cad74846a38e98f64886" exitCode=0 Mar 21 05:28:02 crc kubenswrapper[4839]: I0321 05:28:02.647873 4839 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567848-p6wgd" event={"ID":"c8214f95-33aa-486b-bb82-915b2c5b2cf6","Type":"ContainerDied","Data":"3c4dbc17150a4b84d9f816e99c3c6823e1cf60ce3010cad74846a38e98f64886"} Mar 21 05:28:04 crc kubenswrapper[4839]: I0321 05:28:04.117405 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-p6wgd" Mar 21 05:28:04 crc kubenswrapper[4839]: I0321 05:28:04.286269 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsqsn\" (UniqueName: \"kubernetes.io/projected/c8214f95-33aa-486b-bb82-915b2c5b2cf6-kube-api-access-tsqsn\") pod \"c8214f95-33aa-486b-bb82-915b2c5b2cf6\" (UID: \"c8214f95-33aa-486b-bb82-915b2c5b2cf6\") " Mar 21 05:28:04 crc kubenswrapper[4839]: I0321 05:28:04.293136 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8214f95-33aa-486b-bb82-915b2c5b2cf6-kube-api-access-tsqsn" (OuterVolumeSpecName: "kube-api-access-tsqsn") pod "c8214f95-33aa-486b-bb82-915b2c5b2cf6" (UID: "c8214f95-33aa-486b-bb82-915b2c5b2cf6"). InnerVolumeSpecName "kube-api-access-tsqsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:28:04 crc kubenswrapper[4839]: I0321 05:28:04.391851 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsqsn\" (UniqueName: \"kubernetes.io/projected/c8214f95-33aa-486b-bb82-915b2c5b2cf6-kube-api-access-tsqsn\") on node \"crc\" DevicePath \"\"" Mar 21 05:28:04 crc kubenswrapper[4839]: I0321 05:28:04.672154 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567848-p6wgd" event={"ID":"c8214f95-33aa-486b-bb82-915b2c5b2cf6","Type":"ContainerDied","Data":"bdc7f332bf5b36a06c38ba377e71ce315c1ab651db2e2b3b15234e9d9fa884ed"} Mar 21 05:28:04 crc kubenswrapper[4839]: I0321 05:28:04.672507 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdc7f332bf5b36a06c38ba377e71ce315c1ab651db2e2b3b15234e9d9fa884ed" Mar 21 05:28:04 crc kubenswrapper[4839]: I0321 05:28:04.672328 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567848-p6wgd" Mar 21 05:28:05 crc kubenswrapper[4839]: I0321 05:28:05.187307 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-mkbkh"] Mar 21 05:28:05 crc kubenswrapper[4839]: I0321 05:28:05.196539 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567842-mkbkh"] Mar 21 05:28:06 crc kubenswrapper[4839]: I0321 05:28:06.465034 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98d91ef7-84b2-40fa-b268-b3a42085ecbd" path="/var/lib/kubelet/pods/98d91ef7-84b2-40fa-b268-b3a42085ecbd/volumes" Mar 21 05:28:30 crc kubenswrapper[4839]: I0321 05:28:30.980404 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 21 05:28:30 crc kubenswrapper[4839]: I0321 05:28:30.981071 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:28:57 crc kubenswrapper[4839]: I0321 05:28:57.755047 4839 scope.go:117] "RemoveContainer" containerID="fd8610a23aa4477b05f1e471927e591e3db28e8c730e3f65952a7cfd15d24ba8" Mar 21 05:28:57 crc kubenswrapper[4839]: I0321 05:28:57.777989 4839 scope.go:117] "RemoveContainer" containerID="97ce872ff52632fee3002b46fbe2d1087d0acb59cd9704873c29b470d51ff9e4" Mar 21 05:28:57 crc kubenswrapper[4839]: I0321 05:28:57.824851 4839 scope.go:117] "RemoveContainer" containerID="eedac11fd6ab65c2edef01ab4734c0cb94cc7a431c8be7bf1c6cc4417de55aa3" Mar 21 05:28:57 crc kubenswrapper[4839]: I0321 05:28:57.902512 4839 scope.go:117] "RemoveContainer" containerID="65a7d17f89d1557c72c5ffd06bb72faeb0e67e3bd8925184b20df8ed6afa7a8d" Mar 21 05:29:00 crc kubenswrapper[4839]: I0321 05:29:00.980271 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:29:00 crc kubenswrapper[4839]: I0321 05:29:00.981258 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:29:00 crc kubenswrapper[4839]: I0321 05:29:00.981323 4839 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:29:00 crc kubenswrapper[4839]: I0321 05:29:00.982343 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:29:00 crc kubenswrapper[4839]: I0321 05:29:00.982402 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" gracePeriod=600 Mar 21 05:29:01 crc kubenswrapper[4839]: E0321 05:29:01.110068 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:29:01 crc kubenswrapper[4839]: I0321 05:29:01.313145 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" exitCode=0 Mar 21 05:29:01 crc kubenswrapper[4839]: I0321 05:29:01.313208 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" 
event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d"} Mar 21 05:29:01 crc kubenswrapper[4839]: I0321 05:29:01.313260 4839 scope.go:117] "RemoveContainer" containerID="26d5ad8d8c206d8ada93506f3a162dccbd9846e40dd3da26db34bab6bbf70437" Mar 21 05:29:01 crc kubenswrapper[4839]: I0321 05:29:01.314114 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:29:01 crc kubenswrapper[4839]: E0321 05:29:01.314471 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:29:12 crc kubenswrapper[4839]: I0321 05:29:12.452630 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:29:12 crc kubenswrapper[4839]: E0321 05:29:12.453402 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:29:25 crc kubenswrapper[4839]: I0321 05:29:25.453068 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:29:25 crc kubenswrapper[4839]: E0321 05:29:25.453867 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:29:28 crc kubenswrapper[4839]: I0321 05:29:28.595410 4839 generic.go:334] "Generic (PLEG): container finished" podID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerID="4aacadbb7c340286a8bc5bb514479c70f999217550d1118742a4cf28f857f96a" exitCode=0 Mar 21 05:29:28 crc kubenswrapper[4839]: I0321 05:29:28.595499 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-72qrq/must-gather-mxjpl" event={"ID":"de78e0a8-6c32-44ae-8f44-443eb0f1dd25","Type":"ContainerDied","Data":"4aacadbb7c340286a8bc5bb514479c70f999217550d1118742a4cf28f857f96a"} Mar 21 05:29:28 crc kubenswrapper[4839]: I0321 05:29:28.596823 4839 scope.go:117] "RemoveContainer" containerID="4aacadbb7c340286a8bc5bb514479c70f999217550d1118742a4cf28f857f96a" Mar 21 05:29:28 crc kubenswrapper[4839]: I0321 05:29:28.927811 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-72qrq_must-gather-mxjpl_de78e0a8-6c32-44ae-8f44-443eb0f1dd25/gather/0.log" Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.278876 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-72qrq/must-gather-mxjpl"] Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.279671 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-72qrq/must-gather-mxjpl" podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerName="copy" containerID="cri-o://0acccee08d8e21b640f974b3184f1317711fe544ebde8a6ac1eadbd5cddfd459" gracePeriod=2 Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.291004 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-72qrq/must-gather-mxjpl"] Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.682795 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-72qrq_must-gather-mxjpl_de78e0a8-6c32-44ae-8f44-443eb0f1dd25/copy/0.log" Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.683264 4839 generic.go:334] "Generic (PLEG): container finished" podID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerID="0acccee08d8e21b640f974b3184f1317711fe544ebde8a6ac1eadbd5cddfd459" exitCode=143 Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.829195 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-72qrq_must-gather-mxjpl_de78e0a8-6c32-44ae-8f44-443eb0f1dd25/copy/0.log" Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.830715 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/must-gather-mxjpl" Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.936650 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4s42\" (UniqueName: \"kubernetes.io/projected/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-kube-api-access-t4s42\") pod \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\" (UID: \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.936780 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-must-gather-output\") pod \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\" (UID: \"de78e0a8-6c32-44ae-8f44-443eb0f1dd25\") " Mar 21 05:29:37 crc kubenswrapper[4839]: I0321 05:29:37.943186 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-kube-api-access-t4s42" (OuterVolumeSpecName: "kube-api-access-t4s42") pod "de78e0a8-6c32-44ae-8f44-443eb0f1dd25" 
(UID: "de78e0a8-6c32-44ae-8f44-443eb0f1dd25"). InnerVolumeSpecName "kube-api-access-t4s42". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.039342 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4s42\" (UniqueName: \"kubernetes.io/projected/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-kube-api-access-t4s42\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.093993 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "de78e0a8-6c32-44ae-8f44-443eb0f1dd25" (UID: "de78e0a8-6c32-44ae-8f44-443eb0f1dd25"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.141228 4839 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/de78e0a8-6c32-44ae-8f44-443eb0f1dd25-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.453264 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:29:38 crc kubenswrapper[4839]: E0321 05:29:38.455082 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.465634 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" path="/var/lib/kubelet/pods/de78e0a8-6c32-44ae-8f44-443eb0f1dd25/volumes" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.694416 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-72qrq_must-gather-mxjpl_de78e0a8-6c32-44ae-8f44-443eb0f1dd25/copy/0.log" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.695217 4839 scope.go:117] "RemoveContainer" containerID="0acccee08d8e21b640f974b3184f1317711fe544ebde8a6ac1eadbd5cddfd459" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.695329 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-72qrq/must-gather-mxjpl" Mar 21 05:29:38 crc kubenswrapper[4839]: I0321 05:29:38.716615 4839 scope.go:117] "RemoveContainer" containerID="4aacadbb7c340286a8bc5bb514479c70f999217550d1118742a4cf28f857f96a" Mar 21 05:29:50 crc kubenswrapper[4839]: I0321 05:29:50.453830 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:29:50 crc kubenswrapper[4839]: E0321 05:29:50.454843 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.156809 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567850-6q472"] Mar 21 05:30:00 crc kubenswrapper[4839]: E0321 05:30:00.158017 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8214f95-33aa-486b-bb82-915b2c5b2cf6" containerName="oc" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.158038 4839 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c8214f95-33aa-486b-bb82-915b2c5b2cf6" containerName="oc" Mar 21 05:30:00 crc kubenswrapper[4839]: E0321 05:30:00.158058 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerName="copy" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.158066 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerName="copy" Mar 21 05:30:00 crc kubenswrapper[4839]: E0321 05:30:00.158100 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerName="gather" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.158109 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerName="gather" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.158345 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8214f95-33aa-486b-bb82-915b2c5b2cf6" containerName="oc" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.158363 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerName="gather" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.158385 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="de78e0a8-6c32-44ae-8f44-443eb0f1dd25" containerName="copy" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.159152 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-6q472" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.162964 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.163370 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.163592 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.166753 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-6q472"] Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.254361 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z"] Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.255509 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.257718 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.257991 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.271715 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z"] Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.285477 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkbxl\" (UniqueName: \"kubernetes.io/projected/5ff27433-bc42-4edf-bcac-48ffe5e0680a-kube-api-access-hkbxl\") pod \"auto-csr-approver-29567850-6q472\" (UID: \"5ff27433-bc42-4edf-bcac-48ffe5e0680a\") " pod="openshift-infra/auto-csr-approver-29567850-6q472" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.388203 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkbxl\" (UniqueName: \"kubernetes.io/projected/5ff27433-bc42-4edf-bcac-48ffe5e0680a-kube-api-access-hkbxl\") pod \"auto-csr-approver-29567850-6q472\" (UID: \"5ff27433-bc42-4edf-bcac-48ffe5e0680a\") " pod="openshift-infra/auto-csr-approver-29567850-6q472" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.388351 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5068c229-fbb7-489b-909b-767dd8db6c26-secret-volume\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 
05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.390208 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5068c229-fbb7-489b-909b-767dd8db6c26-config-volume\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.390376 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfdj7\" (UniqueName: \"kubernetes.io/projected/5068c229-fbb7-489b-909b-767dd8db6c26-kube-api-access-bfdj7\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.420188 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkbxl\" (UniqueName: \"kubernetes.io/projected/5ff27433-bc42-4edf-bcac-48ffe5e0680a-kube-api-access-hkbxl\") pod \"auto-csr-approver-29567850-6q472\" (UID: \"5ff27433-bc42-4edf-bcac-48ffe5e0680a\") " pod="openshift-infra/auto-csr-approver-29567850-6q472" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.486208 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-6q472" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.492765 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfdj7\" (UniqueName: \"kubernetes.io/projected/5068c229-fbb7-489b-909b-767dd8db6c26-kube-api-access-bfdj7\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.492872 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5068c229-fbb7-489b-909b-767dd8db6c26-secret-volume\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.493037 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5068c229-fbb7-489b-909b-767dd8db6c26-config-volume\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.494401 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5068c229-fbb7-489b-909b-767dd8db6c26-config-volume\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.499779 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5068c229-fbb7-489b-909b-767dd8db6c26-secret-volume\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.515225 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfdj7\" (UniqueName: \"kubernetes.io/projected/5068c229-fbb7-489b-909b-767dd8db6c26-kube-api-access-bfdj7\") pod \"collect-profiles-29567850-ttg6z\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.588900 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.926617 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-6q472"] Mar 21 05:30:00 crc kubenswrapper[4839]: I0321 05:30:00.937201 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:30:01 crc kubenswrapper[4839]: I0321 05:30:01.036760 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z"] Mar 21 05:30:01 crc kubenswrapper[4839]: W0321 05:30:01.043191 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5068c229_fbb7_489b_909b_767dd8db6c26.slice/crio-5d3df8ab1decb57324635594557afe3da3713c265cdc38fdb99134b723d707d6 WatchSource:0}: Error finding container 5d3df8ab1decb57324635594557afe3da3713c265cdc38fdb99134b723d707d6: Status 404 returned error can't find the container with id 5d3df8ab1decb57324635594557afe3da3713c265cdc38fdb99134b723d707d6 Mar 21 05:30:01 crc kubenswrapper[4839]: 
I0321 05:30:01.904959 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" event={"ID":"5068c229-fbb7-489b-909b-767dd8db6c26","Type":"ContainerStarted","Data":"795ea2bdd38729415d4fdc09f70f45e99f6b013a9dbbd17157a70057466b2a66"} Mar 21 05:30:01 crc kubenswrapper[4839]: I0321 05:30:01.905259 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" event={"ID":"5068c229-fbb7-489b-909b-767dd8db6c26","Type":"ContainerStarted","Data":"5d3df8ab1decb57324635594557afe3da3713c265cdc38fdb99134b723d707d6"} Mar 21 05:30:01 crc kubenswrapper[4839]: I0321 05:30:01.907402 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567850-6q472" event={"ID":"5ff27433-bc42-4edf-bcac-48ffe5e0680a","Type":"ContainerStarted","Data":"6a6c7d9e27ba4cfe8063fb41dea20f7f9e9da39d4362c1756955ba046915f307"} Mar 21 05:30:01 crc kubenswrapper[4839]: I0321 05:30:01.923551 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" podStartSLOduration=1.9235247 podStartE2EDuration="1.9235247s" podCreationTimestamp="2026-03-21 05:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:30:01.918241971 +0000 UTC m=+4006.246028647" watchObservedRunningTime="2026-03-21 05:30:01.9235247 +0000 UTC m=+4006.251311376" Mar 21 05:30:02 crc kubenswrapper[4839]: I0321 05:30:02.923728 4839 generic.go:334] "Generic (PLEG): container finished" podID="5068c229-fbb7-489b-909b-767dd8db6c26" containerID="795ea2bdd38729415d4fdc09f70f45e99f6b013a9dbbd17157a70057466b2a66" exitCode=0 Mar 21 05:30:02 crc kubenswrapper[4839]: I0321 05:30:02.924074 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" event={"ID":"5068c229-fbb7-489b-909b-767dd8db6c26","Type":"ContainerDied","Data":"795ea2bdd38729415d4fdc09f70f45e99f6b013a9dbbd17157a70057466b2a66"} Mar 21 05:30:03 crc kubenswrapper[4839]: I0321 05:30:03.452981 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:30:03 crc kubenswrapper[4839]: E0321 05:30:03.453474 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:30:03 crc kubenswrapper[4839]: I0321 05:30:03.933370 4839 generic.go:334] "Generic (PLEG): container finished" podID="5ff27433-bc42-4edf-bcac-48ffe5e0680a" containerID="32ef2594966320293c7652dfc99c30b2eedf27f32e9592ed12c4d3d92de56d1a" exitCode=0 Mar 21 05:30:03 crc kubenswrapper[4839]: I0321 05:30:03.933452 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567850-6q472" event={"ID":"5ff27433-bc42-4edf-bcac-48ffe5e0680a","Type":"ContainerDied","Data":"32ef2594966320293c7652dfc99c30b2eedf27f32e9592ed12c4d3d92de56d1a"} Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.273926 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.372004 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5068c229-fbb7-489b-909b-767dd8db6c26-secret-volume\") pod \"5068c229-fbb7-489b-909b-767dd8db6c26\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.372373 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5068c229-fbb7-489b-909b-767dd8db6c26-config-volume\") pod \"5068c229-fbb7-489b-909b-767dd8db6c26\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.372485 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfdj7\" (UniqueName: \"kubernetes.io/projected/5068c229-fbb7-489b-909b-767dd8db6c26-kube-api-access-bfdj7\") pod \"5068c229-fbb7-489b-909b-767dd8db6c26\" (UID: \"5068c229-fbb7-489b-909b-767dd8db6c26\") " Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.373174 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5068c229-fbb7-489b-909b-767dd8db6c26-config-volume" (OuterVolumeSpecName: "config-volume") pod "5068c229-fbb7-489b-909b-767dd8db6c26" (UID: "5068c229-fbb7-489b-909b-767dd8db6c26"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.378270 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5068c229-fbb7-489b-909b-767dd8db6c26-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5068c229-fbb7-489b-909b-767dd8db6c26" (UID: "5068c229-fbb7-489b-909b-767dd8db6c26"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.385352 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5068c229-fbb7-489b-909b-767dd8db6c26-kube-api-access-bfdj7" (OuterVolumeSpecName: "kube-api-access-bfdj7") pod "5068c229-fbb7-489b-909b-767dd8db6c26" (UID: "5068c229-fbb7-489b-909b-767dd8db6c26"). InnerVolumeSpecName "kube-api-access-bfdj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.477828 4839 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5068c229-fbb7-489b-909b-767dd8db6c26-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.477873 4839 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5068c229-fbb7-489b-909b-767dd8db6c26-config-volume\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.477889 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfdj7\" (UniqueName: \"kubernetes.io/projected/5068c229-fbb7-489b-909b-767dd8db6c26-kube-api-access-bfdj7\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.942530 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.942515 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567850-ttg6z" event={"ID":"5068c229-fbb7-489b-909b-767dd8db6c26","Type":"ContainerDied","Data":"5d3df8ab1decb57324635594557afe3da3713c265cdc38fdb99134b723d707d6"} Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.942998 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d3df8ab1decb57324635594557afe3da3713c265cdc38fdb99134b723d707d6" Mar 21 05:30:04 crc kubenswrapper[4839]: I0321 05:30:04.995793 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j"] Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.004703 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567805-9sr8j"] Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.288745 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-6q472" Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.299658 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkbxl\" (UniqueName: \"kubernetes.io/projected/5ff27433-bc42-4edf-bcac-48ffe5e0680a-kube-api-access-hkbxl\") pod \"5ff27433-bc42-4edf-bcac-48ffe5e0680a\" (UID: \"5ff27433-bc42-4edf-bcac-48ffe5e0680a\") " Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.307006 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ff27433-bc42-4edf-bcac-48ffe5e0680a-kube-api-access-hkbxl" (OuterVolumeSpecName: "kube-api-access-hkbxl") pod "5ff27433-bc42-4edf-bcac-48ffe5e0680a" (UID: "5ff27433-bc42-4edf-bcac-48ffe5e0680a"). 
InnerVolumeSpecName "kube-api-access-hkbxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.402171 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkbxl\" (UniqueName: \"kubernetes.io/projected/5ff27433-bc42-4edf-bcac-48ffe5e0680a-kube-api-access-hkbxl\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.954750 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567850-6q472" event={"ID":"5ff27433-bc42-4edf-bcac-48ffe5e0680a","Type":"ContainerDied","Data":"6a6c7d9e27ba4cfe8063fb41dea20f7f9e9da39d4362c1756955ba046915f307"} Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.955168 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a6c7d9e27ba4cfe8063fb41dea20f7f9e9da39d4362c1756955ba046915f307" Mar 21 05:30:05 crc kubenswrapper[4839]: I0321 05:30:05.954939 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567850-6q472" Mar 21 05:30:06 crc kubenswrapper[4839]: I0321 05:30:06.355155 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-qzhl5"] Mar 21 05:30:06 crc kubenswrapper[4839]: I0321 05:30:06.363362 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567844-qzhl5"] Mar 21 05:30:06 crc kubenswrapper[4839]: I0321 05:30:06.464221 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47d5a79e-3e14-4d49-bed4-a9c49e7b7f26" path="/var/lib/kubelet/pods/47d5a79e-3e14-4d49-bed4-a9c49e7b7f26/volumes" Mar 21 05:30:06 crc kubenswrapper[4839]: I0321 05:30:06.464953 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad5b8a95-e16d-42a8-9069-5294c8934559" path="/var/lib/kubelet/pods/ad5b8a95-e16d-42a8-9069-5294c8934559/volumes" Mar 21 05:30:13 crc kubenswrapper[4839]: I0321 05:30:13.989364 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-csd6j"] Mar 21 05:30:13 crc kubenswrapper[4839]: E0321 05:30:13.990333 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ff27433-bc42-4edf-bcac-48ffe5e0680a" containerName="oc" Mar 21 05:30:13 crc kubenswrapper[4839]: I0321 05:30:13.990346 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ff27433-bc42-4edf-bcac-48ffe5e0680a" containerName="oc" Mar 21 05:30:13 crc kubenswrapper[4839]: E0321 05:30:13.990361 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5068c229-fbb7-489b-909b-767dd8db6c26" containerName="collect-profiles" Mar 21 05:30:13 crc kubenswrapper[4839]: I0321 05:30:13.990369 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5068c229-fbb7-489b-909b-767dd8db6c26" containerName="collect-profiles" Mar 21 05:30:13 crc kubenswrapper[4839]: I0321 05:30:13.990528 4839 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5068c229-fbb7-489b-909b-767dd8db6c26" containerName="collect-profiles" Mar 21 05:30:13 crc kubenswrapper[4839]: I0321 05:30:13.990551 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ff27433-bc42-4edf-bcac-48ffe5e0680a" containerName="oc" Mar 21 05:30:13 crc kubenswrapper[4839]: I0321 05:30:13.991815 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.002540 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csd6j"] Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.068391 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx8sz\" (UniqueName: \"kubernetes.io/projected/a52535f1-c597-4bcd-9cdf-b51230e45194-kube-api-access-bx8sz\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.068466 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-catalog-content\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.068912 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-utilities\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.171990 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-utilities\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.172217 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx8sz\" (UniqueName: \"kubernetes.io/projected/a52535f1-c597-4bcd-9cdf-b51230e45194-kube-api-access-bx8sz\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.172290 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-catalog-content\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.172551 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-utilities\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.172919 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-catalog-content\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.193629 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx8sz\" (UniqueName: 
\"kubernetes.io/projected/a52535f1-c597-4bcd-9cdf-b51230e45194-kube-api-access-bx8sz\") pod \"redhat-operators-csd6j\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.316051 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:14 crc kubenswrapper[4839]: I0321 05:30:14.824549 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csd6j"] Mar 21 05:30:15 crc kubenswrapper[4839]: I0321 05:30:15.028885 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerStarted","Data":"11bf8d4811bd37333961fab92452c6a2b0698d79539b44b206fdc039f7461b0b"} Mar 21 05:30:15 crc kubenswrapper[4839]: I0321 05:30:15.028939 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerStarted","Data":"b0864809c14aa0a411344168780ec1f23fecd6a5710ca66679f834831778488d"} Mar 21 05:30:16 crc kubenswrapper[4839]: I0321 05:30:16.038930 4839 generic.go:334] "Generic (PLEG): container finished" podID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerID="11bf8d4811bd37333961fab92452c6a2b0698d79539b44b206fdc039f7461b0b" exitCode=0 Mar 21 05:30:16 crc kubenswrapper[4839]: I0321 05:30:16.039018 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerDied","Data":"11bf8d4811bd37333961fab92452c6a2b0698d79539b44b206fdc039f7461b0b"} Mar 21 05:30:17 crc kubenswrapper[4839]: I0321 05:30:17.049433 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" 
event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerStarted","Data":"29c4e7c16b734f5d55db99e70306de25f6bd3ce8412aef36e66b3ae3fba4aa67"} Mar 21 05:30:17 crc kubenswrapper[4839]: I0321 05:30:17.452648 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:30:17 crc kubenswrapper[4839]: E0321 05:30:17.453306 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:30:18 crc kubenswrapper[4839]: I0321 05:30:18.064068 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerDied","Data":"29c4e7c16b734f5d55db99e70306de25f6bd3ce8412aef36e66b3ae3fba4aa67"} Mar 21 05:30:18 crc kubenswrapper[4839]: I0321 05:30:18.063827 4839 generic.go:334] "Generic (PLEG): container finished" podID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerID="29c4e7c16b734f5d55db99e70306de25f6bd3ce8412aef36e66b3ae3fba4aa67" exitCode=0 Mar 21 05:30:19 crc kubenswrapper[4839]: I0321 05:30:19.077088 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerStarted","Data":"7c3986fc2d374298329091f03d0df61db6a6535375bb6b72b1d989a838c16ce3"} Mar 21 05:30:19 crc kubenswrapper[4839]: I0321 05:30:19.100591 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-csd6j" podStartSLOduration=3.518213369 podStartE2EDuration="6.100553326s" 
podCreationTimestamp="2026-03-21 05:30:13 +0000 UTC" firstStartedPulling="2026-03-21 05:30:16.040875798 +0000 UTC m=+4020.368662474" lastFinishedPulling="2026-03-21 05:30:18.623215755 +0000 UTC m=+4022.951002431" observedRunningTime="2026-03-21 05:30:19.097223872 +0000 UTC m=+4023.425010548" watchObservedRunningTime="2026-03-21 05:30:19.100553326 +0000 UTC m=+4023.428340002" Mar 21 05:30:24 crc kubenswrapper[4839]: I0321 05:30:24.317027 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:24 crc kubenswrapper[4839]: I0321 05:30:24.317530 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:25 crc kubenswrapper[4839]: I0321 05:30:25.364062 4839 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-csd6j" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="registry-server" probeResult="failure" output=< Mar 21 05:30:25 crc kubenswrapper[4839]: timeout: failed to connect service ":50051" within 1s Mar 21 05:30:25 crc kubenswrapper[4839]: > Mar 21 05:30:28 crc kubenswrapper[4839]: I0321 05:30:28.452699 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:30:28 crc kubenswrapper[4839]: E0321 05:30:28.453519 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:30:34 crc kubenswrapper[4839]: I0321 05:30:34.379271 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:34 crc kubenswrapper[4839]: I0321 05:30:34.439246 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:34 crc kubenswrapper[4839]: I0321 05:30:34.618148 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csd6j"] Mar 21 05:30:36 crc kubenswrapper[4839]: I0321 05:30:36.238961 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-csd6j" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="registry-server" containerID="cri-o://7c3986fc2d374298329091f03d0df61db6a6535375bb6b72b1d989a838c16ce3" gracePeriod=2 Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.254069 4839 generic.go:334] "Generic (PLEG): container finished" podID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerID="7c3986fc2d374298329091f03d0df61db6a6535375bb6b72b1d989a838c16ce3" exitCode=0 Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.254698 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerDied","Data":"7c3986fc2d374298329091f03d0df61db6a6535375bb6b72b1d989a838c16ce3"} Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.374340 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.444493 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-utilities\") pod \"a52535f1-c597-4bcd-9cdf-b51230e45194\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.444645 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx8sz\" (UniqueName: \"kubernetes.io/projected/a52535f1-c597-4bcd-9cdf-b51230e45194-kube-api-access-bx8sz\") pod \"a52535f1-c597-4bcd-9cdf-b51230e45194\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.444679 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-catalog-content\") pod \"a52535f1-c597-4bcd-9cdf-b51230e45194\" (UID: \"a52535f1-c597-4bcd-9cdf-b51230e45194\") " Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.445733 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-utilities" (OuterVolumeSpecName: "utilities") pod "a52535f1-c597-4bcd-9cdf-b51230e45194" (UID: "a52535f1-c597-4bcd-9cdf-b51230e45194"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.453088 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52535f1-c597-4bcd-9cdf-b51230e45194-kube-api-access-bx8sz" (OuterVolumeSpecName: "kube-api-access-bx8sz") pod "a52535f1-c597-4bcd-9cdf-b51230e45194" (UID: "a52535f1-c597-4bcd-9cdf-b51230e45194"). InnerVolumeSpecName "kube-api-access-bx8sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.546803 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.546861 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx8sz\" (UniqueName: \"kubernetes.io/projected/a52535f1-c597-4bcd-9cdf-b51230e45194-kube-api-access-bx8sz\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.593523 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a52535f1-c597-4bcd-9cdf-b51230e45194" (UID: "a52535f1-c597-4bcd-9cdf-b51230e45194"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:30:37 crc kubenswrapper[4839]: I0321 05:30:37.648383 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a52535f1-c597-4bcd-9cdf-b51230e45194-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.275212 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csd6j" event={"ID":"a52535f1-c597-4bcd-9cdf-b51230e45194","Type":"ContainerDied","Data":"b0864809c14aa0a411344168780ec1f23fecd6a5710ca66679f834831778488d"} Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.276288 4839 scope.go:117] "RemoveContainer" containerID="7c3986fc2d374298329091f03d0df61db6a6535375bb6b72b1d989a838c16ce3" Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.276238 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-csd6j" Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.296878 4839 scope.go:117] "RemoveContainer" containerID="29c4e7c16b734f5d55db99e70306de25f6bd3ce8412aef36e66b3ae3fba4aa67" Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.315227 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csd6j"] Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.338291 4839 scope.go:117] "RemoveContainer" containerID="11bf8d4811bd37333961fab92452c6a2b0698d79539b44b206fdc039f7461b0b" Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.346106 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-csd6j"] Mar 21 05:30:38 crc kubenswrapper[4839]: I0321 05:30:38.463619 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" path="/var/lib/kubelet/pods/a52535f1-c597-4bcd-9cdf-b51230e45194/volumes" Mar 21 05:30:41 crc kubenswrapper[4839]: I0321 05:30:41.452838 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:30:41 crc kubenswrapper[4839]: E0321 05:30:41.453719 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:30:54 crc kubenswrapper[4839]: I0321 05:30:54.453301 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:30:54 crc kubenswrapper[4839]: E0321 05:30:54.454481 4839 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:30:58 crc kubenswrapper[4839]: I0321 05:30:58.029102 4839 scope.go:117] "RemoveContainer" containerID="77bf1caf6b0a8e86542e0854eb602cb5e02b5990b61a512100dca57b8da7f1d1" Mar 21 05:30:58 crc kubenswrapper[4839]: I0321 05:30:58.065495 4839 scope.go:117] "RemoveContainer" containerID="cb7937f2ae576fec589579ad2dd17797c203b9a5a4641193da2cc618f8fd881c" Mar 21 05:31:08 crc kubenswrapper[4839]: I0321 05:31:08.453909 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:31:08 crc kubenswrapper[4839]: E0321 05:31:08.454819 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:31:22 crc kubenswrapper[4839]: I0321 05:31:22.454740 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:31:22 crc kubenswrapper[4839]: E0321 05:31:22.456040 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:31:33 crc kubenswrapper[4839]: I0321 05:31:33.453310 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:31:33 crc kubenswrapper[4839]: E0321 05:31:33.454179 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:31:45 crc kubenswrapper[4839]: I0321 05:31:45.453171 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:31:45 crc kubenswrapper[4839]: E0321 05:31:45.454035 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.131647 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l4gg7"] Mar 21 05:31:48 crc kubenswrapper[4839]: E0321 05:31:48.136230 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="registry-server" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.136282 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" 
containerName="registry-server" Mar 21 05:31:48 crc kubenswrapper[4839]: E0321 05:31:48.136331 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="extract-utilities" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.136345 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="extract-utilities" Mar 21 05:31:48 crc kubenswrapper[4839]: E0321 05:31:48.136374 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="extract-content" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.136389 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="extract-content" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.137098 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52535f1-c597-4bcd-9cdf-b51230e45194" containerName="registry-server" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.139051 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.144077 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l4gg7"] Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.261422 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-catalog-content\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.261614 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwf7c\" (UniqueName: \"kubernetes.io/projected/06965c4c-a775-46ef-ac7b-9638ec75c419-kube-api-access-xwf7c\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.261649 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-utilities\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.363595 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwf7c\" (UniqueName: \"kubernetes.io/projected/06965c4c-a775-46ef-ac7b-9638ec75c419-kube-api-access-xwf7c\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.363667 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-utilities\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.363725 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-catalog-content\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.364403 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-utilities\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.364464 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-catalog-content\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.386098 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwf7c\" (UniqueName: \"kubernetes.io/projected/06965c4c-a775-46ef-ac7b-9638ec75c419-kube-api-access-xwf7c\") pod \"community-operators-l4gg7\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:31:48 crc kubenswrapper[4839]: I0321 05:31:48.475992 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:31:49 crc kubenswrapper[4839]: I0321 05:31:49.005966 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l4gg7"] Mar 21 05:31:49 crc kubenswrapper[4839]: I0321 05:31:49.932225 4839 generic.go:334] "Generic (PLEG): container finished" podID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerID="0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8" exitCode=0 Mar 21 05:31:49 crc kubenswrapper[4839]: I0321 05:31:49.932785 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4gg7" event={"ID":"06965c4c-a775-46ef-ac7b-9638ec75c419","Type":"ContainerDied","Data":"0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8"} Mar 21 05:31:49 crc kubenswrapper[4839]: I0321 05:31:49.932839 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4gg7" event={"ID":"06965c4c-a775-46ef-ac7b-9638ec75c419","Type":"ContainerStarted","Data":"6d21bf3853d86faa6fa835635204d84060fc0472abf70b72de3c10254b58b0e9"} Mar 21 05:31:51 crc kubenswrapper[4839]: I0321 05:31:51.954728 4839 generic.go:334] "Generic (PLEG): container finished" podID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerID="82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8" exitCode=0 Mar 21 05:31:51 crc kubenswrapper[4839]: I0321 05:31:51.954769 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4gg7" event={"ID":"06965c4c-a775-46ef-ac7b-9638ec75c419","Type":"ContainerDied","Data":"82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8"} Mar 21 05:31:52 crc kubenswrapper[4839]: I0321 05:31:52.966754 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4gg7" 
event={"ID":"06965c4c-a775-46ef-ac7b-9638ec75c419","Type":"ContainerStarted","Data":"e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2"} Mar 21 05:31:52 crc kubenswrapper[4839]: I0321 05:31:52.992612 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l4gg7" podStartSLOduration=2.600862572 podStartE2EDuration="4.992588859s" podCreationTimestamp="2026-03-21 05:31:48 +0000 UTC" firstStartedPulling="2026-03-21 05:31:49.941510284 +0000 UTC m=+4114.269296960" lastFinishedPulling="2026-03-21 05:31:52.333236571 +0000 UTC m=+4116.661023247" observedRunningTime="2026-03-21 05:31:52.988332079 +0000 UTC m=+4117.316118755" watchObservedRunningTime="2026-03-21 05:31:52.992588859 +0000 UTC m=+4117.320375535" Mar 21 05:31:56 crc kubenswrapper[4839]: I0321 05:31:56.464215 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:31:56 crc kubenswrapper[4839]: E0321 05:31:56.465078 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:31:58 crc kubenswrapper[4839]: I0321 05:31:58.476625 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:31:58 crc kubenswrapper[4839]: I0321 05:31:58.476934 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:31:58 crc kubenswrapper[4839]: I0321 05:31:58.535705 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:31:59 crc kubenswrapper[4839]: I0321 05:31:59.076726 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:31:59 crc kubenswrapper[4839]: I0321 05:31:59.126295 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l4gg7"] Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.151148 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567852-gb6qv"] Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.152354 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-gb6qv" Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.154516 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.154817 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.158366 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.167276 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-gb6qv"] Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.294673 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p56ld\" (UniqueName: \"kubernetes.io/projected/d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1-kube-api-access-p56ld\") pod \"auto-csr-approver-29567852-gb6qv\" (UID: \"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1\") " pod="openshift-infra/auto-csr-approver-29567852-gb6qv" Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.397059 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p56ld\" (UniqueName: \"kubernetes.io/projected/d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1-kube-api-access-p56ld\") pod \"auto-csr-approver-29567852-gb6qv\" (UID: \"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1\") " pod="openshift-infra/auto-csr-approver-29567852-gb6qv" Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.416910 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p56ld\" (UniqueName: \"kubernetes.io/projected/d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1-kube-api-access-p56ld\") pod \"auto-csr-approver-29567852-gb6qv\" (UID: \"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1\") " pod="openshift-infra/auto-csr-approver-29567852-gb6qv" Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.492861 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-gb6qv" Mar 21 05:32:00 crc kubenswrapper[4839]: I0321 05:32:00.959650 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-gb6qv"] Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.053183 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567852-gb6qv" event={"ID":"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1","Type":"ContainerStarted","Data":"d836d723ae82d86055fa728c2549d853b7e29c4678af59f29c1c2ecf50bd5917"} Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.053336 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l4gg7" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="registry-server" containerID="cri-o://e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2" gracePeriod=2 Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.518501 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.625169 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-catalog-content\") pod \"06965c4c-a775-46ef-ac7b-9638ec75c419\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.625733 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-utilities\") pod \"06965c4c-a775-46ef-ac7b-9638ec75c419\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.625854 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwf7c\" (UniqueName: \"kubernetes.io/projected/06965c4c-a775-46ef-ac7b-9638ec75c419-kube-api-access-xwf7c\") pod \"06965c4c-a775-46ef-ac7b-9638ec75c419\" (UID: \"06965c4c-a775-46ef-ac7b-9638ec75c419\") " Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.627038 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-utilities" (OuterVolumeSpecName: "utilities") pod "06965c4c-a775-46ef-ac7b-9638ec75c419" (UID: "06965c4c-a775-46ef-ac7b-9638ec75c419"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.636970 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06965c4c-a775-46ef-ac7b-9638ec75c419-kube-api-access-xwf7c" (OuterVolumeSpecName: "kube-api-access-xwf7c") pod "06965c4c-a775-46ef-ac7b-9638ec75c419" (UID: "06965c4c-a775-46ef-ac7b-9638ec75c419"). InnerVolumeSpecName "kube-api-access-xwf7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.687859 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06965c4c-a775-46ef-ac7b-9638ec75c419" (UID: "06965c4c-a775-46ef-ac7b-9638ec75c419"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.728286 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.728324 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06965c4c-a775-46ef-ac7b-9638ec75c419-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:32:01 crc kubenswrapper[4839]: I0321 05:32:01.728336 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwf7c\" (UniqueName: \"kubernetes.io/projected/06965c4c-a775-46ef-ac7b-9638ec75c419-kube-api-access-xwf7c\") on node \"crc\" DevicePath \"\"" Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.068763 4839 generic.go:334] "Generic (PLEG): container finished" podID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerID="e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2" exitCode=0 Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.068850 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4gg7" event={"ID":"06965c4c-a775-46ef-ac7b-9638ec75c419","Type":"ContainerDied","Data":"e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2"} Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.068915 4839 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-l4gg7" event={"ID":"06965c4c-a775-46ef-ac7b-9638ec75c419","Type":"ContainerDied","Data":"6d21bf3853d86faa6fa835635204d84060fc0472abf70b72de3c10254b58b0e9"} Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.068939 4839 scope.go:117] "RemoveContainer" containerID="e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2" Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.068877 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4gg7" Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.096887 4839 scope.go:117] "RemoveContainer" containerID="82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8" Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.128995 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l4gg7"] Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.136160 4839 scope.go:117] "RemoveContainer" containerID="0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8" Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.165000 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l4gg7"] Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.199944 4839 scope.go:117] "RemoveContainer" containerID="e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2" Mar 21 05:32:02 crc kubenswrapper[4839]: E0321 05:32:02.201285 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2\": container with ID starting with e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2 not found: ID does not exist" containerID="e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2" Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 
05:32:02.201391 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2"} err="failed to get container status \"e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2\": rpc error: code = NotFound desc = could not find container \"e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2\": container with ID starting with e9baf04dadd9c1bbef6e24a2bef358d8faeec1a054d8e47187cf9a2b97bb6dd2 not found: ID does not exist" Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.201432 4839 scope.go:117] "RemoveContainer" containerID="82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8" Mar 21 05:32:02 crc kubenswrapper[4839]: E0321 05:32:02.202116 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8\": container with ID starting with 82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8 not found: ID does not exist" containerID="82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8" Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.202149 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8"} err="failed to get container status \"82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8\": rpc error: code = NotFound desc = could not find container \"82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8\": container with ID starting with 82dc51bfb265289e4e99906f3b27dc184e7f48898b7bb203114fd4f5fcc512d8 not found: ID does not exist" Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.202164 4839 scope.go:117] "RemoveContainer" containerID="0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8" Mar 21 05:32:02 crc 
kubenswrapper[4839]: E0321 05:32:02.203917 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8\": container with ID starting with 0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8 not found: ID does not exist" containerID="0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8" Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.203943 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8"} err="failed to get container status \"0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8\": rpc error: code = NotFound desc = could not find container \"0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8\": container with ID starting with 0360c3cec9736618095e5477a17f0a4efb323af668769d345433a0f1699f1fd8 not found: ID does not exist" Mar 21 05:32:02 crc kubenswrapper[4839]: I0321 05:32:02.462880 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" path="/var/lib/kubelet/pods/06965c4c-a775-46ef-ac7b-9638ec75c419/volumes" Mar 21 05:32:03 crc kubenswrapper[4839]: I0321 05:32:03.079184 4839 generic.go:334] "Generic (PLEG): container finished" podID="d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1" containerID="cc3802ac333d73f4abb16330d261760555d938cdc36d0050dadf5466674b13ba" exitCode=0 Mar 21 05:32:03 crc kubenswrapper[4839]: I0321 05:32:03.080368 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567852-gb6qv" event={"ID":"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1","Type":"ContainerDied","Data":"cc3802ac333d73f4abb16330d261760555d938cdc36d0050dadf5466674b13ba"} Mar 21 05:32:04 crc kubenswrapper[4839]: I0321 05:32:04.436495 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-gb6qv" Mar 21 05:32:04 crc kubenswrapper[4839]: I0321 05:32:04.508304 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p56ld\" (UniqueName: \"kubernetes.io/projected/d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1-kube-api-access-p56ld\") pod \"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1\" (UID: \"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1\") " Mar 21 05:32:04 crc kubenswrapper[4839]: I0321 05:32:04.515793 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1-kube-api-access-p56ld" (OuterVolumeSpecName: "kube-api-access-p56ld") pod "d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1" (UID: "d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1"). InnerVolumeSpecName "kube-api-access-p56ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:32:04 crc kubenswrapper[4839]: I0321 05:32:04.611054 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p56ld\" (UniqueName: \"kubernetes.io/projected/d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1-kube-api-access-p56ld\") on node \"crc\" DevicePath \"\"" Mar 21 05:32:05 crc kubenswrapper[4839]: I0321 05:32:05.105950 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567852-gb6qv" event={"ID":"d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1","Type":"ContainerDied","Data":"d836d723ae82d86055fa728c2549d853b7e29c4678af59f29c1c2ecf50bd5917"} Mar 21 05:32:05 crc kubenswrapper[4839]: I0321 05:32:05.106004 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567852-gb6qv" Mar 21 05:32:05 crc kubenswrapper[4839]: I0321 05:32:05.107849 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d836d723ae82d86055fa728c2549d853b7e29c4678af59f29c1c2ecf50bd5917" Mar 21 05:32:05 crc kubenswrapper[4839]: I0321 05:32:05.498574 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-8dwrk"] Mar 21 05:32:05 crc kubenswrapper[4839]: I0321 05:32:05.507848 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567846-8dwrk"] Mar 21 05:32:06 crc kubenswrapper[4839]: I0321 05:32:06.466275 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="274043bb-38cf-435f-9cb1-01d194d34325" path="/var/lib/kubelet/pods/274043bb-38cf-435f-9cb1-01d194d34325/volumes" Mar 21 05:32:07 crc kubenswrapper[4839]: I0321 05:32:07.452665 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:32:07 crc kubenswrapper[4839]: E0321 05:32:07.453304 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:32:20 crc kubenswrapper[4839]: I0321 05:32:20.453156 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:32:20 crc kubenswrapper[4839]: E0321 05:32:20.453830 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:32:34 crc kubenswrapper[4839]: I0321 05:32:34.453387 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:32:34 crc kubenswrapper[4839]: E0321 05:32:34.454217 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.062104 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ppvlf/must-gather-sjwj7"] Mar 21 05:32:41 crc kubenswrapper[4839]: E0321 05:32:41.063146 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="extract-content" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.063163 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="extract-content" Mar 21 05:32:41 crc kubenswrapper[4839]: E0321 05:32:41.063184 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="registry-server" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.063192 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="registry-server" Mar 21 05:32:41 crc kubenswrapper[4839]: E0321 05:32:41.063206 4839 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1" containerName="oc" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.063216 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1" containerName="oc" Mar 21 05:32:41 crc kubenswrapper[4839]: E0321 05:32:41.063232 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="extract-utilities" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.063239 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="extract-utilities" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.063499 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="06965c4c-a775-46ef-ac7b-9638ec75c419" containerName="registry-server" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.063521 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1" containerName="oc" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.064730 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.066603 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ppvlf"/"openshift-service-ca.crt" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.066811 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ppvlf"/"kube-root-ca.crt" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.067042 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ppvlf"/"default-dockercfg-xwmcj" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.069886 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ppvlf/must-gather-sjwj7"] Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.123691 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx5vq\" (UniqueName: \"kubernetes.io/projected/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-kube-api-access-rx5vq\") pod \"must-gather-sjwj7\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") " pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.123951 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-must-gather-output\") pod \"must-gather-sjwj7\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") " pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.225496 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx5vq\" (UniqueName: \"kubernetes.io/projected/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-kube-api-access-rx5vq\") pod \"must-gather-sjwj7\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") " 
pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.225549 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-must-gather-output\") pod \"must-gather-sjwj7\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") " pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.226094 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-must-gather-output\") pod \"must-gather-sjwj7\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") " pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.248105 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx5vq\" (UniqueName: \"kubernetes.io/projected/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-kube-api-access-rx5vq\") pod \"must-gather-sjwj7\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") " pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.383355 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:32:41 crc kubenswrapper[4839]: I0321 05:32:41.946434 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ppvlf/must-gather-sjwj7"] Mar 21 05:32:42 crc kubenswrapper[4839]: I0321 05:32:42.442985 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" event={"ID":"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d","Type":"ContainerStarted","Data":"14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e"} Mar 21 05:32:42 crc kubenswrapper[4839]: I0321 05:32:42.443241 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" event={"ID":"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d","Type":"ContainerStarted","Data":"5a207bcd98fea8bdcbd8fcac34144924b54bfe6689991ce19989ac8cd6f7c3fd"} Mar 21 05:32:43 crc kubenswrapper[4839]: I0321 05:32:43.453233 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" event={"ID":"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d","Type":"ContainerStarted","Data":"048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8"} Mar 21 05:32:43 crc kubenswrapper[4839]: I0321 05:32:43.472007 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" podStartSLOduration=2.471988495 podStartE2EDuration="2.471988495s" podCreationTimestamp="2026-03-21 05:32:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:32:43.470334828 +0000 UTC m=+4167.798121504" watchObservedRunningTime="2026-03-21 05:32:43.471988495 +0000 UTC m=+4167.799775171" Mar 21 05:32:45 crc kubenswrapper[4839]: I0321 05:32:45.962899 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-jm2cv"] Mar 21 05:32:45 crc kubenswrapper[4839]: 
I0321 05:32:45.964454 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.032504 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81dc8803-7188-45df-8f25-fe1037bbdd01-host\") pod \"crc-debug-jm2cv\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") " pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.032657 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95t7g\" (UniqueName: \"kubernetes.io/projected/81dc8803-7188-45df-8f25-fe1037bbdd01-kube-api-access-95t7g\") pod \"crc-debug-jm2cv\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") " pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.133509 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81dc8803-7188-45df-8f25-fe1037bbdd01-host\") pod \"crc-debug-jm2cv\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") " pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.133585 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95t7g\" (UniqueName: \"kubernetes.io/projected/81dc8803-7188-45df-8f25-fe1037bbdd01-kube-api-access-95t7g\") pod \"crc-debug-jm2cv\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") " pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.133696 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81dc8803-7188-45df-8f25-fe1037bbdd01-host\") pod \"crc-debug-jm2cv\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") 
" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.166397 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95t7g\" (UniqueName: \"kubernetes.io/projected/81dc8803-7188-45df-8f25-fe1037bbdd01-kube-api-access-95t7g\") pod \"crc-debug-jm2cv\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") " pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.289515 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.462362 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:32:46 crc kubenswrapper[4839]: E0321 05:32:46.462650 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:32:46 crc kubenswrapper[4839]: I0321 05:32:46.485330 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" event={"ID":"81dc8803-7188-45df-8f25-fe1037bbdd01","Type":"ContainerStarted","Data":"1430886b299af015100e7533b1000c0e59127cd85c1994b254190e65fcf9e655"} Mar 21 05:32:47 crc kubenswrapper[4839]: I0321 05:32:47.498643 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" event={"ID":"81dc8803-7188-45df-8f25-fe1037bbdd01","Type":"ContainerStarted","Data":"e0aaa7c76a0ee9b1660ca2e309fd9d60f43c9f5876dc19d939b4dd884d137805"} Mar 21 05:32:47 crc kubenswrapper[4839]: I0321 
05:32:47.527293 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" podStartSLOduration=2.527275901 podStartE2EDuration="2.527275901s" podCreationTimestamp="2026-03-21 05:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:32:47.523917806 +0000 UTC m=+4171.851704482" watchObservedRunningTime="2026-03-21 05:32:47.527275901 +0000 UTC m=+4171.855062577" Mar 21 05:32:58 crc kubenswrapper[4839]: I0321 05:32:58.209411 4839 scope.go:117] "RemoveContainer" containerID="3e713d4f3a2eccb8fba4adfa096046056f6cf5d095f6cdc7fc919eb1fb945456" Mar 21 05:33:00 crc kubenswrapper[4839]: I0321 05:33:00.453445 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:33:00 crc kubenswrapper[4839]: E0321 05:33:00.453984 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:33:11 crc kubenswrapper[4839]: I0321 05:33:11.452969 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:33:11 crc kubenswrapper[4839]: E0321 05:33:11.454013 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:33:21 crc kubenswrapper[4839]: I0321 05:33:21.782103 4839 generic.go:334] "Generic (PLEG): container finished" podID="81dc8803-7188-45df-8f25-fe1037bbdd01" containerID="e0aaa7c76a0ee9b1660ca2e309fd9d60f43c9f5876dc19d939b4dd884d137805" exitCode=0 Mar 21 05:33:21 crc kubenswrapper[4839]: I0321 05:33:21.782179 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" event={"ID":"81dc8803-7188-45df-8f25-fe1037bbdd01","Type":"ContainerDied","Data":"e0aaa7c76a0ee9b1660ca2e309fd9d60f43c9f5876dc19d939b4dd884d137805"} Mar 21 05:33:22 crc kubenswrapper[4839]: I0321 05:33:22.453279 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:33:22 crc kubenswrapper[4839]: E0321 05:33:22.453750 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:33:22 crc kubenswrapper[4839]: I0321 05:33:22.888902 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:33:22 crc kubenswrapper[4839]: I0321 05:33:22.924819 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-jm2cv"] Mar 21 05:33:22 crc kubenswrapper[4839]: I0321 05:33:22.934514 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-jm2cv"] Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.057041 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81dc8803-7188-45df-8f25-fe1037bbdd01-host\") pod \"81dc8803-7188-45df-8f25-fe1037bbdd01\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") " Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.057351 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95t7g\" (UniqueName: \"kubernetes.io/projected/81dc8803-7188-45df-8f25-fe1037bbdd01-kube-api-access-95t7g\") pod \"81dc8803-7188-45df-8f25-fe1037bbdd01\" (UID: \"81dc8803-7188-45df-8f25-fe1037bbdd01\") " Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.057362 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81dc8803-7188-45df-8f25-fe1037bbdd01-host" (OuterVolumeSpecName: "host") pod "81dc8803-7188-45df-8f25-fe1037bbdd01" (UID: "81dc8803-7188-45df-8f25-fe1037bbdd01"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.058051 4839 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81dc8803-7188-45df-8f25-fe1037bbdd01-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.069256 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81dc8803-7188-45df-8f25-fe1037bbdd01-kube-api-access-95t7g" (OuterVolumeSpecName: "kube-api-access-95t7g") pod "81dc8803-7188-45df-8f25-fe1037bbdd01" (UID: "81dc8803-7188-45df-8f25-fe1037bbdd01"). InnerVolumeSpecName "kube-api-access-95t7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.159987 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95t7g\" (UniqueName: \"kubernetes.io/projected/81dc8803-7188-45df-8f25-fe1037bbdd01-kube-api-access-95t7g\") on node \"crc\" DevicePath \"\"" Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.800540 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1430886b299af015100e7533b1000c0e59127cd85c1994b254190e65fcf9e655" Mar 21 05:33:23 crc kubenswrapper[4839]: I0321 05:33:23.800614 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-jm2cv" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.122485 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-qnkbb"] Mar 21 05:33:24 crc kubenswrapper[4839]: E0321 05:33:24.123005 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81dc8803-7188-45df-8f25-fe1037bbdd01" containerName="container-00" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.123028 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="81dc8803-7188-45df-8f25-fe1037bbdd01" containerName="container-00" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.123305 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="81dc8803-7188-45df-8f25-fe1037bbdd01" containerName="container-00" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.124090 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.278507 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9def19c-1036-49fc-874e-aa1013e5c547-host\") pod \"crc-debug-qnkbb\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.278580 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmn5k\" (UniqueName: \"kubernetes.io/projected/c9def19c-1036-49fc-874e-aa1013e5c547-kube-api-access-jmn5k\") pod \"crc-debug-qnkbb\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.380867 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmn5k\" (UniqueName: 
\"kubernetes.io/projected/c9def19c-1036-49fc-874e-aa1013e5c547-kube-api-access-jmn5k\") pod \"crc-debug-qnkbb\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.381100 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9def19c-1036-49fc-874e-aa1013e5c547-host\") pod \"crc-debug-qnkbb\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.381227 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9def19c-1036-49fc-874e-aa1013e5c547-host\") pod \"crc-debug-qnkbb\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.403469 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmn5k\" (UniqueName: \"kubernetes.io/projected/c9def19c-1036-49fc-874e-aa1013e5c547-kube-api-access-jmn5k\") pod \"crc-debug-qnkbb\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.441265 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.466162 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81dc8803-7188-45df-8f25-fe1037bbdd01" path="/var/lib/kubelet/pods/81dc8803-7188-45df-8f25-fe1037bbdd01/volumes" Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.810540 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" event={"ID":"c9def19c-1036-49fc-874e-aa1013e5c547","Type":"ContainerStarted","Data":"d991b608c7dd15cd8e8f6e12d6073ad24091724986f4f1fa631390572cd83d55"} Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.811181 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" event={"ID":"c9def19c-1036-49fc-874e-aa1013e5c547","Type":"ContainerStarted","Data":"55701753f649ce649fcb45fa7eda4471b54715543775e82e35dbbd0d3456ffd0"} Mar 21 05:33:24 crc kubenswrapper[4839]: I0321 05:33:24.825723 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" podStartSLOduration=0.825704086 podStartE2EDuration="825.704086ms" podCreationTimestamp="2026-03-21 05:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-21 05:33:24.824136052 +0000 UTC m=+4209.151922738" watchObservedRunningTime="2026-03-21 05:33:24.825704086 +0000 UTC m=+4209.153490762" Mar 21 05:33:25 crc kubenswrapper[4839]: I0321 05:33:25.830171 4839 generic.go:334] "Generic (PLEG): container finished" podID="c9def19c-1036-49fc-874e-aa1013e5c547" containerID="d991b608c7dd15cd8e8f6e12d6073ad24091724986f4f1fa631390572cd83d55" exitCode=0 Mar 21 05:33:25 crc kubenswrapper[4839]: I0321 05:33:25.830524 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" 
event={"ID":"c9def19c-1036-49fc-874e-aa1013e5c547","Type":"ContainerDied","Data":"d991b608c7dd15cd8e8f6e12d6073ad24091724986f4f1fa631390572cd83d55"} Mar 21 05:33:26 crc kubenswrapper[4839]: I0321 05:33:26.938841 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:26 crc kubenswrapper[4839]: I0321 05:33:26.972938 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-qnkbb"] Mar 21 05:33:26 crc kubenswrapper[4839]: I0321 05:33:26.982430 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-qnkbb"] Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.026538 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmn5k\" (UniqueName: \"kubernetes.io/projected/c9def19c-1036-49fc-874e-aa1013e5c547-kube-api-access-jmn5k\") pod \"c9def19c-1036-49fc-874e-aa1013e5c547\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.026868 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9def19c-1036-49fc-874e-aa1013e5c547-host\") pod \"c9def19c-1036-49fc-874e-aa1013e5c547\" (UID: \"c9def19c-1036-49fc-874e-aa1013e5c547\") " Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.026990 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9def19c-1036-49fc-874e-aa1013e5c547-host" (OuterVolumeSpecName: "host") pod "c9def19c-1036-49fc-874e-aa1013e5c547" (UID: "c9def19c-1036-49fc-874e-aa1013e5c547"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.027386 4839 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c9def19c-1036-49fc-874e-aa1013e5c547-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.032627 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9def19c-1036-49fc-874e-aa1013e5c547-kube-api-access-jmn5k" (OuterVolumeSpecName: "kube-api-access-jmn5k") pod "c9def19c-1036-49fc-874e-aa1013e5c547" (UID: "c9def19c-1036-49fc-874e-aa1013e5c547"). InnerVolumeSpecName "kube-api-access-jmn5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.128995 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmn5k\" (UniqueName: \"kubernetes.io/projected/c9def19c-1036-49fc-874e-aa1013e5c547-kube-api-access-jmn5k\") on node \"crc\" DevicePath \"\"" Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.846084 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55701753f649ce649fcb45fa7eda4471b54715543775e82e35dbbd0d3456ffd0" Mar 21 05:33:27 crc kubenswrapper[4839]: I0321 05:33:27.846156 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-qnkbb" Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.181145 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-4bzpc"] Mar 21 05:33:28 crc kubenswrapper[4839]: E0321 05:33:28.181767 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9def19c-1036-49fc-874e-aa1013e5c547" containerName="container-00" Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.181786 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9def19c-1036-49fc-874e-aa1013e5c547" containerName="container-00" Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.182023 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9def19c-1036-49fc-874e-aa1013e5c547" containerName="container-00" Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.182793 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-4bzpc" Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.350860 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drbw4\" (UniqueName: \"kubernetes.io/projected/9a65d87c-e91f-4d4d-846d-16c7699da843-kube-api-access-drbw4\") pod \"crc-debug-4bzpc\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") " pod="openshift-must-gather-ppvlf/crc-debug-4bzpc" Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.350986 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a65d87c-e91f-4d4d-846d-16c7699da843-host\") pod \"crc-debug-4bzpc\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") " pod="openshift-must-gather-ppvlf/crc-debug-4bzpc" Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.453164 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/9a65d87c-e91f-4d4d-846d-16c7699da843-host\") pod \"crc-debug-4bzpc\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") " pod="openshift-must-gather-ppvlf/crc-debug-4bzpc" Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.453303 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a65d87c-e91f-4d4d-846d-16c7699da843-host\") pod \"crc-debug-4bzpc\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") " pod="openshift-must-gather-ppvlf/crc-debug-4bzpc" Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.453317 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drbw4\" (UniqueName: \"kubernetes.io/projected/9a65d87c-e91f-4d4d-846d-16c7699da843-kube-api-access-drbw4\") pod \"crc-debug-4bzpc\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") " pod="openshift-must-gather-ppvlf/crc-debug-4bzpc" Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.464915 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9def19c-1036-49fc-874e-aa1013e5c547" path="/var/lib/kubelet/pods/c9def19c-1036-49fc-874e-aa1013e5c547/volumes" Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.475326 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drbw4\" (UniqueName: \"kubernetes.io/projected/9a65d87c-e91f-4d4d-846d-16c7699da843-kube-api-access-drbw4\") pod \"crc-debug-4bzpc\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") " pod="openshift-must-gather-ppvlf/crc-debug-4bzpc" Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.500947 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-4bzpc" Mar 21 05:33:28 crc kubenswrapper[4839]: W0321 05:33:28.526238 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a65d87c_e91f_4d4d_846d_16c7699da843.slice/crio-f57d1f21fdbdd1b9a60363a5606e7fdd0f95473ba15321b34545ab027e9b215e WatchSource:0}: Error finding container f57d1f21fdbdd1b9a60363a5606e7fdd0f95473ba15321b34545ab027e9b215e: Status 404 returned error can't find the container with id f57d1f21fdbdd1b9a60363a5606e7fdd0f95473ba15321b34545ab027e9b215e Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.856520 4839 generic.go:334] "Generic (PLEG): container finished" podID="9a65d87c-e91f-4d4d-846d-16c7699da843" containerID="1d47288112350ee1777065d7e7d1470b54937a865dd7e52b77f16a09cf7130e3" exitCode=0 Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.856605 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-4bzpc" event={"ID":"9a65d87c-e91f-4d4d-846d-16c7699da843","Type":"ContainerDied","Data":"1d47288112350ee1777065d7e7d1470b54937a865dd7e52b77f16a09cf7130e3"} Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.856866 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/crc-debug-4bzpc" event={"ID":"9a65d87c-e91f-4d4d-846d-16c7699da843","Type":"ContainerStarted","Data":"f57d1f21fdbdd1b9a60363a5606e7fdd0f95473ba15321b34545ab027e9b215e"} Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.902758 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-4bzpc"] Mar 21 05:33:28 crc kubenswrapper[4839]: I0321 05:33:28.911884 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ppvlf/crc-debug-4bzpc"] Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.238200 4839 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-2fdn9"] Mar 21 05:33:29 crc kubenswrapper[4839]: E0321 05:33:29.239526 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a65d87c-e91f-4d4d-846d-16c7699da843" containerName="container-00" Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.239624 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a65d87c-e91f-4d4d-846d-16c7699da843" containerName="container-00" Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.239921 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a65d87c-e91f-4d4d-846d-16c7699da843" containerName="container-00" Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.241937 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.250975 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fdn9"] Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.368746 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-utilities\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.368933 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s225m\" (UniqueName: \"kubernetes.io/projected/319559c6-c34d-4b00-ba90-8fcd5b5ff425-kube-api-access-s225m\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.368985 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-catalog-content\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.470804 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s225m\" (UniqueName: \"kubernetes.io/projected/319559c6-c34d-4b00-ba90-8fcd5b5ff425-kube-api-access-s225m\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.470897 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-catalog-content\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.471119 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-utilities\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.471849 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-catalog-content\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.471944 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-utilities\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.492543 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s225m\" (UniqueName: \"kubernetes.io/projected/319559c6-c34d-4b00-ba90-8fcd5b5ff425-kube-api-access-s225m\") pod \"redhat-marketplace-2fdn9\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:29 crc kubenswrapper[4839]: I0321 05:33:29.559634 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.058057 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-4bzpc" Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.182505 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a65d87c-e91f-4d4d-846d-16c7699da843-host\") pod \"9a65d87c-e91f-4d4d-846d-16c7699da843\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") " Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.182763 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drbw4\" (UniqueName: \"kubernetes.io/projected/9a65d87c-e91f-4d4d-846d-16c7699da843-kube-api-access-drbw4\") pod \"9a65d87c-e91f-4d4d-846d-16c7699da843\" (UID: \"9a65d87c-e91f-4d4d-846d-16c7699da843\") " Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.183685 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a65d87c-e91f-4d4d-846d-16c7699da843-host" (OuterVolumeSpecName: "host") pod "9a65d87c-e91f-4d4d-846d-16c7699da843" (UID: 
"9a65d87c-e91f-4d4d-846d-16c7699da843"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.188292 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a65d87c-e91f-4d4d-846d-16c7699da843-kube-api-access-drbw4" (OuterVolumeSpecName: "kube-api-access-drbw4") pod "9a65d87c-e91f-4d4d-846d-16c7699da843" (UID: "9a65d87c-e91f-4d4d-846d-16c7699da843"). InnerVolumeSpecName "kube-api-access-drbw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.284834 4839 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a65d87c-e91f-4d4d-846d-16c7699da843-host\") on node \"crc\" DevicePath \"\"" Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.285976 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drbw4\" (UniqueName: \"kubernetes.io/projected/9a65d87c-e91f-4d4d-846d-16c7699da843-kube-api-access-drbw4\") on node \"crc\" DevicePath \"\"" Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.411304 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fdn9"] Mar 21 05:33:30 crc kubenswrapper[4839]: W0321 05:33:30.416685 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod319559c6_c34d_4b00_ba90_8fcd5b5ff425.slice/crio-bbdb617a70ae3560ec2109cc611ceab71b54198665cd34c53ff0b89b1320b6d8 WatchSource:0}: Error finding container bbdb617a70ae3560ec2109cc611ceab71b54198665cd34c53ff0b89b1320b6d8: Status 404 returned error can't find the container with id bbdb617a70ae3560ec2109cc611ceab71b54198665cd34c53ff0b89b1320b6d8 Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.468451 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9a65d87c-e91f-4d4d-846d-16c7699da843" path="/var/lib/kubelet/pods/9a65d87c-e91f-4d4d-846d-16c7699da843/volumes" Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.875103 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/crc-debug-4bzpc" Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.875126 4839 scope.go:117] "RemoveContainer" containerID="1d47288112350ee1777065d7e7d1470b54937a865dd7e52b77f16a09cf7130e3" Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.877433 4839 generic.go:334] "Generic (PLEG): container finished" podID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerID="f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4" exitCode=0 Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.877465 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fdn9" event={"ID":"319559c6-c34d-4b00-ba90-8fcd5b5ff425","Type":"ContainerDied","Data":"f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4"} Mar 21 05:33:30 crc kubenswrapper[4839]: I0321 05:33:30.877487 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fdn9" event={"ID":"319559c6-c34d-4b00-ba90-8fcd5b5ff425","Type":"ContainerStarted","Data":"bbdb617a70ae3560ec2109cc611ceab71b54198665cd34c53ff0b89b1320b6d8"} Mar 21 05:33:32 crc kubenswrapper[4839]: I0321 05:33:32.897768 4839 generic.go:334] "Generic (PLEG): container finished" podID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerID="08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39" exitCode=0 Mar 21 05:33:32 crc kubenswrapper[4839]: I0321 05:33:32.897844 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fdn9" event={"ID":"319559c6-c34d-4b00-ba90-8fcd5b5ff425","Type":"ContainerDied","Data":"08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39"} Mar 21 05:33:33 crc 
kubenswrapper[4839]: I0321 05:33:33.452993 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:33:33 crc kubenswrapper[4839]: E0321 05:33:33.453313 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:33:33 crc kubenswrapper[4839]: I0321 05:33:33.909098 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fdn9" event={"ID":"319559c6-c34d-4b00-ba90-8fcd5b5ff425","Type":"ContainerStarted","Data":"e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0"} Mar 21 05:33:33 crc kubenswrapper[4839]: I0321 05:33:33.932451 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2fdn9" podStartSLOduration=2.499160751 podStartE2EDuration="4.932424921s" podCreationTimestamp="2026-03-21 05:33:29 +0000 UTC" firstStartedPulling="2026-03-21 05:33:30.879248497 +0000 UTC m=+4215.207035173" lastFinishedPulling="2026-03-21 05:33:33.312512657 +0000 UTC m=+4217.640299343" observedRunningTime="2026-03-21 05:33:33.92669451 +0000 UTC m=+4218.254481206" watchObservedRunningTime="2026-03-21 05:33:33.932424921 +0000 UTC m=+4218.260211597" Mar 21 05:33:39 crc kubenswrapper[4839]: I0321 05:33:39.560741 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:39 crc kubenswrapper[4839]: I0321 05:33:39.561402 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 
05:33:39 crc kubenswrapper[4839]: I0321 05:33:39.923358 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:39 crc kubenswrapper[4839]: I0321 05:33:39.998490 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:40 crc kubenswrapper[4839]: I0321 05:33:40.170177 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fdn9"] Mar 21 05:33:41 crc kubenswrapper[4839]: I0321 05:33:41.973206 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2fdn9" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="registry-server" containerID="cri-o://e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0" gracePeriod=2 Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.463559 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.548623 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s225m\" (UniqueName: \"kubernetes.io/projected/319559c6-c34d-4b00-ba90-8fcd5b5ff425-kube-api-access-s225m\") pod \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.548723 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-utilities\") pod \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.548774 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-catalog-content\") pod \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\" (UID: \"319559c6-c34d-4b00-ba90-8fcd5b5ff425\") " Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.549627 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-utilities" (OuterVolumeSpecName: "utilities") pod "319559c6-c34d-4b00-ba90-8fcd5b5ff425" (UID: "319559c6-c34d-4b00-ba90-8fcd5b5ff425"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.565793 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/319559c6-c34d-4b00-ba90-8fcd5b5ff425-kube-api-access-s225m" (OuterVolumeSpecName: "kube-api-access-s225m") pod "319559c6-c34d-4b00-ba90-8fcd5b5ff425" (UID: "319559c6-c34d-4b00-ba90-8fcd5b5ff425"). InnerVolumeSpecName "kube-api-access-s225m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.583624 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "319559c6-c34d-4b00-ba90-8fcd5b5ff425" (UID: "319559c6-c34d-4b00-ba90-8fcd5b5ff425"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.651272 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.651323 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/319559c6-c34d-4b00-ba90-8fcd5b5ff425-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.651338 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s225m\" (UniqueName: \"kubernetes.io/projected/319559c6-c34d-4b00-ba90-8fcd5b5ff425-kube-api-access-s225m\") on node \"crc\" DevicePath \"\"" Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.985919 4839 generic.go:334] "Generic (PLEG): container finished" podID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerID="e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0" exitCode=0 Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.985961 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2fdn9" event={"ID":"319559c6-c34d-4b00-ba90-8fcd5b5ff425","Type":"ContainerDied","Data":"e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0"} Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.985989 4839 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-2fdn9" event={"ID":"319559c6-c34d-4b00-ba90-8fcd5b5ff425","Type":"ContainerDied","Data":"bbdb617a70ae3560ec2109cc611ceab71b54198665cd34c53ff0b89b1320b6d8"} Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.986007 4839 scope.go:117] "RemoveContainer" containerID="e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0" Mar 21 05:33:42 crc kubenswrapper[4839]: I0321 05:33:42.986161 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2fdn9" Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.006913 4839 scope.go:117] "RemoveContainer" containerID="08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39" Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.033699 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fdn9"] Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.041744 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2fdn9"] Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.049717 4839 scope.go:117] "RemoveContainer" containerID="f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4" Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.097079 4839 scope.go:117] "RemoveContainer" containerID="e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0" Mar 21 05:33:43 crc kubenswrapper[4839]: E0321 05:33:43.097885 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0\": container with ID starting with e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0 not found: ID does not exist" containerID="e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0" Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.097928 4839 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0"} err="failed to get container status \"e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0\": rpc error: code = NotFound desc = could not find container \"e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0\": container with ID starting with e2003cc5eef8937b4d5e6d01a9f312da1bdde7a4f93033f9bbd72be3a73d9ea0 not found: ID does not exist" Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.097953 4839 scope.go:117] "RemoveContainer" containerID="08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39" Mar 21 05:33:43 crc kubenswrapper[4839]: E0321 05:33:43.098980 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39\": container with ID starting with 08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39 not found: ID does not exist" containerID="08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39" Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.099058 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39"} err="failed to get container status \"08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39\": rpc error: code = NotFound desc = could not find container \"08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39\": container with ID starting with 08fe7a9ab4632db40330db26ea460fb7563a01b7cc604af38618f2de6a05bf39 not found: ID does not exist" Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.099099 4839 scope.go:117] "RemoveContainer" containerID="f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4" Mar 21 05:33:43 crc kubenswrapper[4839]: E0321 
05:33:43.099547 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4\": container with ID starting with f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4 not found: ID does not exist" containerID="f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4" Mar 21 05:33:43 crc kubenswrapper[4839]: I0321 05:33:43.099633 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4"} err="failed to get container status \"f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4\": rpc error: code = NotFound desc = could not find container \"f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4\": container with ID starting with f2b597e1fa2c74bb598dec6011739de388be4c84a928103a06aadbe11ea920c4 not found: ID does not exist" Mar 21 05:33:44 crc kubenswrapper[4839]: I0321 05:33:44.464684 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" path="/var/lib/kubelet/pods/319559c6-c34d-4b00-ba90-8fcd5b5ff425/volumes" Mar 21 05:33:48 crc kubenswrapper[4839]: I0321 05:33:48.453626 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:33:48 crc kubenswrapper[4839]: E0321 05:33:48.454866 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.152685 
4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567854-85fvh"] Mar 21 05:34:00 crc kubenswrapper[4839]: E0321 05:34:00.153635 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="registry-server" Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.153649 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="registry-server" Mar 21 05:34:00 crc kubenswrapper[4839]: E0321 05:34:00.153664 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="extract-utilities" Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.153670 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="extract-utilities" Mar 21 05:34:00 crc kubenswrapper[4839]: E0321 05:34:00.153687 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="extract-content" Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.153693 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="extract-content" Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.153876 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="319559c6-c34d-4b00-ba90-8fcd5b5ff425" containerName="registry-server" Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.154532 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-85fvh" Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.158699 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.158959 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.159170 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.167200 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-85fvh"] Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.290302 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbrks\" (UniqueName: \"kubernetes.io/projected/bec1d36d-4ff4-4f29-9d04-59f088e00f09-kube-api-access-cbrks\") pod \"auto-csr-approver-29567854-85fvh\" (UID: \"bec1d36d-4ff4-4f29-9d04-59f088e00f09\") " pod="openshift-infra/auto-csr-approver-29567854-85fvh" Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.392795 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbrks\" (UniqueName: \"kubernetes.io/projected/bec1d36d-4ff4-4f29-9d04-59f088e00f09-kube-api-access-cbrks\") pod \"auto-csr-approver-29567854-85fvh\" (UID: \"bec1d36d-4ff4-4f29-9d04-59f088e00f09\") " pod="openshift-infra/auto-csr-approver-29567854-85fvh" Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.413563 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbrks\" (UniqueName: \"kubernetes.io/projected/bec1d36d-4ff4-4f29-9d04-59f088e00f09-kube-api-access-cbrks\") pod \"auto-csr-approver-29567854-85fvh\" (UID: \"bec1d36d-4ff4-4f29-9d04-59f088e00f09\") " 
pod="openshift-infra/auto-csr-approver-29567854-85fvh" Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.482965 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-85fvh" Mar 21 05:34:00 crc kubenswrapper[4839]: I0321 05:34:00.917755 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-85fvh"] Mar 21 05:34:01 crc kubenswrapper[4839]: I0321 05:34:01.158855 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567854-85fvh" event={"ID":"bec1d36d-4ff4-4f29-9d04-59f088e00f09","Type":"ContainerStarted","Data":"31223f38f218f9eee4ae54c62fd90b1f022c42abbeb7d89f51483d3b58494628"} Mar 21 05:34:02 crc kubenswrapper[4839]: I0321 05:34:02.468226 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:34:03 crc kubenswrapper[4839]: I0321 05:34:03.178996 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"c9fc249e5d17a6c2fdbc1eaec440716c42661d1c2e7c8e6b17923104003e02fe"} Mar 21 05:34:03 crc kubenswrapper[4839]: I0321 05:34:03.184373 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567854-85fvh" event={"ID":"bec1d36d-4ff4-4f29-9d04-59f088e00f09","Type":"ContainerStarted","Data":"1b773a94d9de7762b645d818a8305a6aa83ff1f49522be66070dc127da6682d7"} Mar 21 05:34:03 crc kubenswrapper[4839]: I0321 05:34:03.236150 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567854-85fvh" podStartSLOduration=2.172201956 podStartE2EDuration="3.236131172s" podCreationTimestamp="2026-03-21 05:34:00 +0000 UTC" firstStartedPulling="2026-03-21 05:34:00.925204983 +0000 UTC m=+4245.252991659" 
lastFinishedPulling="2026-03-21 05:34:01.989134199 +0000 UTC m=+4246.316920875" observedRunningTime="2026-03-21 05:34:03.223458414 +0000 UTC m=+4247.551245080" watchObservedRunningTime="2026-03-21 05:34:03.236131172 +0000 UTC m=+4247.563917848" Mar 21 05:34:04 crc kubenswrapper[4839]: I0321 05:34:04.195552 4839 generic.go:334] "Generic (PLEG): container finished" podID="bec1d36d-4ff4-4f29-9d04-59f088e00f09" containerID="1b773a94d9de7762b645d818a8305a6aa83ff1f49522be66070dc127da6682d7" exitCode=0 Mar 21 05:34:04 crc kubenswrapper[4839]: I0321 05:34:04.195687 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567854-85fvh" event={"ID":"bec1d36d-4ff4-4f29-9d04-59f088e00f09","Type":"ContainerDied","Data":"1b773a94d9de7762b645d818a8305a6aa83ff1f49522be66070dc127da6682d7"} Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.582863 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-85fvh" Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.612725 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d9cf4c794-jb7lf_37ba14c5-dfc7-4268-86c9-c0efe37fe6c9/barbican-api/0.log" Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.690311 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbrks\" (UniqueName: \"kubernetes.io/projected/bec1d36d-4ff4-4f29-9d04-59f088e00f09-kube-api-access-cbrks\") pod \"bec1d36d-4ff4-4f29-9d04-59f088e00f09\" (UID: \"bec1d36d-4ff4-4f29-9d04-59f088e00f09\") " Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.696555 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bec1d36d-4ff4-4f29-9d04-59f088e00f09-kube-api-access-cbrks" (OuterVolumeSpecName: "kube-api-access-cbrks") pod "bec1d36d-4ff4-4f29-9d04-59f088e00f09" (UID: "bec1d36d-4ff4-4f29-9d04-59f088e00f09"). 
InnerVolumeSpecName "kube-api-access-cbrks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.792645 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbrks\" (UniqueName: \"kubernetes.io/projected/bec1d36d-4ff4-4f29-9d04-59f088e00f09-kube-api-access-cbrks\") on node \"crc\" DevicePath \"\"" Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.865157 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d9cf4c794-jb7lf_37ba14c5-dfc7-4268-86c9-c0efe37fe6c9/barbican-api-log/0.log" Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.883431 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7b946d96f4-chv76_e6e03301-fb6e-467b-b19d-21b5c475d35c/barbican-keystone-listener/0.log" Mar 21 05:34:05 crc kubenswrapper[4839]: I0321 05:34:05.918295 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7b946d96f4-chv76_e6e03301-fb6e-467b-b19d-21b5c475d35c/barbican-keystone-listener-log/0.log" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.111796 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-db77b8b5f-grbp8_3563c0f9-9e82-4798-bae3-b3836a6b5866/barbican-worker/0.log" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.119611 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-db77b8b5f-grbp8_3563c0f9-9e82-4798-bae3-b3836a6b5866/barbican-worker-log/0.log" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.214189 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567854-85fvh" event={"ID":"bec1d36d-4ff4-4f29-9d04-59f088e00f09","Type":"ContainerDied","Data":"31223f38f218f9eee4ae54c62fd90b1f022c42abbeb7d89f51483d3b58494628"} Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.214527 4839 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="31223f38f218f9eee4ae54c62fd90b1f022c42abbeb7d89f51483d3b58494628" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.214267 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567854-85fvh" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.570616 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/ceilometer-central-agent/0.log" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.647018 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ndptz_a1d76458-d587-4960-9bcc-7e3d3122b44d/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.665839 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-p6wgd"] Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.681056 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567848-p6wgd"] Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.729639 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/ceilometer-notification-agent/0.log" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.808703 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/proxy-httpd/0.log" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.901459 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d1041d12-2cae-4009-a3f3-9df6e219d03b/sg-core/0.log" Mar 21 05:34:06 crc kubenswrapper[4839]: I0321 05:34:06.993300 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_5162af3c-3b00-4643-afd9-680f6e2f5c03/cinder-api-log/0.log" Mar 21 05:34:07 crc kubenswrapper[4839]: I0321 05:34:07.046599 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5162af3c-3b00-4643-afd9-680f6e2f5c03/cinder-api/0.log" Mar 21 05:34:07 crc kubenswrapper[4839]: I0321 05:34:07.148798 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_77964653-d242-4258-b06e-c9cd0fb64d84/cinder-scheduler/0.log" Mar 21 05:34:07 crc kubenswrapper[4839]: I0321 05:34:07.342136 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_77964653-d242-4258-b06e-c9cd0fb64d84/probe/0.log" Mar 21 05:34:07 crc kubenswrapper[4839]: I0321 05:34:07.543417 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-2rvtx_a58d82e4-2de9-4680-a08c-6eeb775ed08a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:07 crc kubenswrapper[4839]: I0321 05:34:07.681508 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-qkclf_ab9d4433-fe0e-471b-84f8-568b31920ed3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:07 crc kubenswrapper[4839]: I0321 05:34:07.713666 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-n4nl2_a31699b4-0a8f-42c8-b7f4-319ef1d5423a/init/0.log" Mar 21 05:34:07 crc kubenswrapper[4839]: I0321 05:34:07.977095 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-n4nl2_a31699b4-0a8f-42c8-b7f4-319ef1d5423a/init/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.130822 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-bs7tt_7f875f01-020a-4cd6-950a-4dbb6ccb344e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.137275 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-n4nl2_a31699b4-0a8f-42c8-b7f4-319ef1d5423a/dnsmasq-dns/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.219677 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e3e15ec-7425-4e0a-99a8-db3bb1cd486c/glance-httpd/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.314326 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e3e15ec-7425-4e0a-99a8-db3bb1cd486c/glance-log/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.462319 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c7aa4192-53bb-412e-b25e-1fe47c59fa75/glance-httpd/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.464463 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8214f95-33aa-486b-bb82-915b2c5b2cf6" path="/var/lib/kubelet/pods/c8214f95-33aa-486b-bb82-915b2c5b2cf6/volumes" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.507442 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c7aa4192-53bb-412e-b25e-1fe47c59fa75/glance-log/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.692865 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9c97f4dbd-k2scs_579308eb-854d-4160-ad35-8677f2d0e634/horizon/0.log" Mar 21 05:34:08 crc kubenswrapper[4839]: I0321 05:34:08.919159 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-s7pk7_268d87b5-57ec-49ff-be62-fe59e6b4b819/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:09 crc kubenswrapper[4839]: I0321 05:34:09.174090 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9c97f4dbd-k2scs_579308eb-854d-4160-ad35-8677f2d0e634/horizon-log/0.log" Mar 21 05:34:09 crc kubenswrapper[4839]: I0321 05:34:09.345751 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xdvx2_7538d496-3768-42b7-9f2e-70e1b44a9d6b/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:09 crc kubenswrapper[4839]: I0321 05:34:09.409668 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cb996784d-fvhvp_6a3fcdf0-3099-467b-928b-89a4876130fe/keystone-api/0.log" Mar 21 05:34:09 crc kubenswrapper[4839]: I0321 05:34:09.669532 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29567821-rmctn_666be2f4-0416-4086-94d3-c48c82f380b2/keystone-cron/0.log" Mar 21 05:34:09 crc kubenswrapper[4839]: I0321 05:34:09.776283 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1626316f-b029-4424-b783-25eeb2790eb2/kube-state-metrics/0.log" Mar 21 05:34:10 crc kubenswrapper[4839]: I0321 05:34:10.361549 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-748dbf85fc-jslwv_cd21ac8b-d3c0-4f0c-9205-d60d55425d8a/neutron-api/0.log" Mar 21 05:34:10 crc kubenswrapper[4839]: I0321 05:34:10.400529 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-w48j6_2d056acb-0183-4157-a830-fff4cd1dcacf/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:11 crc kubenswrapper[4839]: I0321 05:34:11.059812 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-748dbf85fc-jslwv_cd21ac8b-d3c0-4f0c-9205-d60d55425d8a/neutron-httpd/0.log" Mar 21 05:34:11 crc kubenswrapper[4839]: I0321 05:34:11.143843 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-vx26d_ceef8f42-5d77-44c1-ac39-edf0080f68e0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:11 crc kubenswrapper[4839]: I0321 05:34:11.721750 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_152d0351-12d2-4cf1-ad49-fd943b223442/nova-cell0-conductor-conductor/0.log" Mar 21 05:34:11 crc kubenswrapper[4839]: I0321 05:34:11.763339 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_627bf6a3-cf5d-42e1-9250-ba6684bb2cfc/nova-api-log/0.log" Mar 21 05:34:12 crc kubenswrapper[4839]: I0321 05:34:12.037244 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3194b187-fe06-4eed-b725-995cef2b05a0/nova-cell1-conductor-conductor/0.log" Mar 21 05:34:12 crc kubenswrapper[4839]: I0321 05:34:12.174386 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_9ddf8fc2-ec2a-4b98-aa76-2dc43426e3f2/nova-cell1-novncproxy-novncproxy/0.log" Mar 21 05:34:12 crc kubenswrapper[4839]: I0321 05:34:12.275052 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_627bf6a3-cf5d-42e1-9250-ba6684bb2cfc/nova-api-api/0.log" Mar 21 05:34:12 crc kubenswrapper[4839]: I0321 05:34:12.717864 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-hf42f_3f8728ca-30ff-41a9-8a48-e3bb7911bcc7/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:12 crc kubenswrapper[4839]: I0321 05:34:12.986101 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_0aafbc7f-e890-4a32-8531-f148aeea18e6/nova-metadata-log/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.361422 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0aafbc7f-e890-4a32-8531-f148aeea18e6/nova-metadata-metadata/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.409951 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d22e92-45bd-4d1e-954e-3ade801245d4/mysql-bootstrap/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.510199 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_bbecccff-0ecc-44ff-a57b-f7289b8bcf5a/nova-scheduler-scheduler/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.590468 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d22e92-45bd-4d1e-954e-3ade801245d4/mysql-bootstrap/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.660574 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_30d22e92-45bd-4d1e-954e-3ade801245d4/galera/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.799602 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4f1edf0d-f220-4815-aeb6-e4507576247a/mysql-bootstrap/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.983999 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4f1edf0d-f220-4815-aeb6-e4507576247a/mysql-bootstrap/0.log" Mar 21 05:34:13 crc kubenswrapper[4839]: I0321 05:34:13.996792 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4f1edf0d-f220-4815-aeb6-e4507576247a/galera/0.log" Mar 21 05:34:14 crc kubenswrapper[4839]: I0321 05:34:14.024599 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_52b9f7e1-d86c-457e-9391-eee855a9f7a7/openstackclient/0.log" Mar 21 05:34:14 crc kubenswrapper[4839]: I0321 05:34:14.269840 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mx5tf_64d13111-845e-4c61-a4ce-483ddfb799b7/openstack-network-exporter/0.log" Mar 21 05:34:14 crc kubenswrapper[4839]: I0321 05:34:14.342135 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovsdb-server-init/0.log" Mar 21 05:34:14 crc kubenswrapper[4839]: I0321 05:34:14.459625 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovsdb-server-init/0.log" Mar 21 05:34:14 crc kubenswrapper[4839]: I0321 05:34:14.508787 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovs-vswitchd/0.log" Mar 21 05:34:14 crc kubenswrapper[4839]: I0321 05:34:14.649106 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hrww8_3d74e911-e100-4e79-89be-202e06bb4d30/ovsdb-server/0.log" Mar 21 05:34:14 crc kubenswrapper[4839]: I0321 05:34:14.657021 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-qt5s4_b31b64cb-0266-4b8a-9fcb-ae5e36c8309a/ovn-controller/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.006198 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dbcaa531-3e09-48c7-8535-76f3e1f5c303/ovn-northd/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.019720 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dbcaa531-3e09-48c7-8535-76f3e1f5c303/openstack-network-exporter/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.045115 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-v4wqq_7e5c69e5-d234-4a0b-a327-1cf44ddaf1bd/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.254145 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4a7a1028-3deb-4033-890c-db0861c6a9a2/openstack-network-exporter/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.270031 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4a7a1028-3deb-4033-890c-db0861c6a9a2/ovsdbserver-nb/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.396639 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8c2e5ef4-e4c0-4278-897e-ce5d00b4079d/openstack-network-exporter/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.505608 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8c2e5ef4-e4c0-4278-897e-ce5d00b4079d/ovsdbserver-sb/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.587623 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75bd8b89b4-djjlh_bf5a44f8-8eb1-4953-b611-a02576e414ea/placement-api/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.697633 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75bd8b89b4-djjlh_bf5a44f8-8eb1-4953-b611-a02576e414ea/placement-log/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.717445 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fa82c4a0-2b0e-4e22-9e91-7fc899122414/setup-container/0.log" Mar 21 05:34:15 crc kubenswrapper[4839]: I0321 05:34:15.983361 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fa82c4a0-2b0e-4e22-9e91-7fc899122414/setup-container/0.log" Mar 21 05:34:16 crc kubenswrapper[4839]: I0321 05:34:16.014062 4839 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fa82c4a0-2b0e-4e22-9e91-7fc899122414/rabbitmq/0.log" Mar 21 05:34:16 crc kubenswrapper[4839]: I0321 05:34:16.056561 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bfff67da-8ea4-4798-9b8d-58a3abac4347/setup-container/0.log" Mar 21 05:34:16 crc kubenswrapper[4839]: I0321 05:34:16.211163 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bfff67da-8ea4-4798-9b8d-58a3abac4347/setup-container/0.log" Mar 21 05:34:16 crc kubenswrapper[4839]: I0321 05:34:16.227098 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_bfff67da-8ea4-4798-9b8d-58a3abac4347/rabbitmq/0.log" Mar 21 05:34:16 crc kubenswrapper[4839]: I0321 05:34:16.274417 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-mkn9r_66c3e343-3306-455d-89d7-db17c1bd53ed/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:16 crc kubenswrapper[4839]: I0321 05:34:16.470010 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pgfnn_a6dd2bff-543f-4ebb-b908-3e528f322548/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:16 crc kubenswrapper[4839]: I0321 05:34:16.540290 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-mbmlq_acb0bb61-c53a-4171-bca5-4a3141d6904a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.096333 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-chfcw_39dbacec-c845-4f19-92a9-c0e63fba203c/ssh-known-hosts-edpm-deployment/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.114585 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-55fzl_26adbd7b-7994-4bea-9f94-338881339833/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.320775 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b66c6bfff-76gfx_1af5fd5b-8392-4e55-b3fb-fdc9285dd135/proxy-server/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.404345 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b66c6bfff-76gfx_1af5fd5b-8392-4e55-b3fb-fdc9285dd135/proxy-httpd/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.445734 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-kkvzq_5484abbf-53f2-445a-b6fe-0996eba95345/swift-ring-rebalance/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.618038 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-auditor/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.663036 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-replicator/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.691630 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-reaper/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.803710 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/account-server/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.813313 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-auditor/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.883719 4839 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-replicator/0.log" Mar 21 05:34:17 crc kubenswrapper[4839]: I0321 05:34:17.927131 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-server/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.052537 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/container-updater/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.058079 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-auditor/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.081720 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-expirer/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.234750 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-replicator/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.257686 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-updater/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.311674 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/object-server/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.349957 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/rsync/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.450473 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_9848d2f0-c562-4b2a-bd1c-cd91c6754079/swift-recon-cron/0.log" Mar 21 05:34:18 crc kubenswrapper[4839]: I0321 05:34:18.675041 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_65c89a5c-df7e-4d6c-bc7e-5e4fecfc6cb3/tempest-tests-tempest-tests-runner/0.log" Mar 21 05:34:19 crc kubenswrapper[4839]: I0321 05:34:19.014437 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-zmjtq_4f49b501-bec5-4fe1-89d7-ff3c217ba580/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:19 crc kubenswrapper[4839]: I0321 05:34:19.033212 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_d4f9fe7e-8922-4015-a3eb-8c0f829cc5f8/test-operator-logs-container/0.log" Mar 21 05:34:19 crc kubenswrapper[4839]: I0321 05:34:19.186783 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-jsp4h_f9d60b3b-b1b4-4d98-9da2-e152ac410c81/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 21 05:34:28 crc kubenswrapper[4839]: I0321 05:34:28.720082 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3c49bdbb-0c05-4dea-8de8-61ca09b7e84c/memcached/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.180556 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-2mkmz_0c51ffa0-2285-4f7e-af09-0cafba139934/manager/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.291295 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-9s4vt_ee9d64a7-0d03-4cb0-a266-47b26f9957b5/manager/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.477353 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/util/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.689809 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/pull/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.717359 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/pull/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.771330 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/util/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.907280 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/util/0.log" Mar 21 05:34:46 crc kubenswrapper[4839]: I0321 05:34:46.919995 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/pull/0.log" Mar 21 05:34:47 crc kubenswrapper[4839]: I0321 05:34:47.022823 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e32239b1d0c83face19bfb7ecc802c1e0782962973f0d7ba9f259e86a2x5d6m_f63f3493-d532-4d99-94c0-ab8648252dab/extract/0.log" Mar 21 05:34:47 crc kubenswrapper[4839]: I0321 05:34:47.241322 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-6s6q7_d3dc722f-f66c-46a0-9b1a-ae1b9c4de060/manager/0.log" Mar 21 05:34:47 crc 
kubenswrapper[4839]: I0321 05:34:47.346682 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-2n27d_fd731e7e-440b-4e77-a778-08a4a62e0c9f/manager/0.log" Mar 21 05:34:47 crc kubenswrapper[4839]: I0321 05:34:47.578387 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-dncxc_05f30a88-e899-4727-9440-981d010a1342/manager/0.log" Mar 21 05:34:47 crc kubenswrapper[4839]: I0321 05:34:47.971535 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-d7h7r_acb1d7ac-b3f9-4564-8346-344ffb5c3964/manager/0.log" Mar 21 05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.238719 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-8sg4d_ccec0d11-294b-43a2-be2e-fcef8a6818c6/manager/0.log" Mar 21 05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.295474 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-bsdjs_ca2a8cd0-1c71-45bb-b4fc-4c7f82515b3b/manager/0.log" Mar 21 05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.471796 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-gzh8j_6074766c-0ecd-4051-a676-dcc21b24184f/manager/0.log" Mar 21 05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.495812 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-k4lg5_7a7bf7a3-acea-4059-8a89-db576f3588d1/manager/0.log" Mar 21 05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.679803 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-sp4j4_2162bafb-7e49-435c-9591-d8b725f10336/manager/0.log" Mar 21 
05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.774586 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-94vpf_70702cd5-6815-4a01-98a4-2f4dfaeef839/manager/0.log" Mar 21 05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.985313 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-wjw9j_6914418f-3639-4ebc-a58d-d8b478cbf6b4/manager/0.log" Mar 21 05:34:48 crc kubenswrapper[4839]: I0321 05:34:48.996107 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-6p4mn_faac458b-73d9-4fb8-9f1c-50f7521088b0/manager/0.log" Mar 21 05:34:49 crc kubenswrapper[4839]: I0321 05:34:49.186272 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-8gc22_859b11bc-e9fb-40a2-a053-66a07337965c/manager/0.log" Mar 21 05:34:49 crc kubenswrapper[4839]: I0321 05:34:49.303064 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-948579bb7-j6fx6_b27308cc-b2b7-4bf0-a3ca-55ccdfa47f59/operator/0.log" Mar 21 05:34:49 crc kubenswrapper[4839]: I0321 05:34:49.576905 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-lj8h4_6ff65f56-ff89-43c6-b087-6d3c3b72d2ef/registry-server/0.log" Mar 21 05:34:49 crc kubenswrapper[4839]: I0321 05:34:49.819690 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-qt58c_379b40a1-e3f5-448b-b668-0f168457e5d0/manager/0.log" Mar 21 05:34:49 crc kubenswrapper[4839]: I0321 05:34:49.880733 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-x75fd_361c2d7b-9a75-41fd-953d-4b1bd64ca6df/manager/0.log" 
Mar 21 05:34:50 crc kubenswrapper[4839]: I0321 05:34:50.123130 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-lzbtt_c8584ecb-dc92-4cec-9178-3017f09095da/operator/0.log" Mar 21 05:34:50 crc kubenswrapper[4839]: I0321 05:34:50.152231 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-xt7xt_2045f5d2-c67e-47cd-b16d-3c69d449f099/manager/0.log" Mar 21 05:34:50 crc kubenswrapper[4839]: I0321 05:34:50.420806 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-7f4qh_5eeb53bd-3988-458f-baa5-d265e0178aea/manager/0.log" Mar 21 05:34:50 crc kubenswrapper[4839]: I0321 05:34:50.423338 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-btkvt_d3ea9c2e-11a4-492e-9e84-8294e81ce775/manager/0.log" Mar 21 05:34:50 crc kubenswrapper[4839]: I0321 05:34:50.537483 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-hh27s_1d32b541-7b80-492b-adac-e51d5090b668/manager/0.log" Mar 21 05:34:50 crc kubenswrapper[4839]: I0321 05:34:50.670528 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5ccd4855ff-jx6pn_06f9e67e-8978-46a1-9dc8-c511197241e2/manager/0.log" Mar 21 05:34:58 crc kubenswrapper[4839]: I0321 05:34:58.361866 4839 scope.go:117] "RemoveContainer" containerID="3c4dbc17150a4b84d9f816e99c3c6823e1cf60ce3010cad74846a38e98f64886" Mar 21 05:35:10 crc kubenswrapper[4839]: I0321 05:35:10.226935 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-whlp9_40014780-8cb8-47fa-8b2c-c4fb7d04a85c/control-plane-machine-set-operator/0.log" Mar 21 05:35:10 crc 
kubenswrapper[4839]: I0321 05:35:10.467373 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nmj8p_c4d393d7-42d7-4b7d-a3cd-f7e325b97c54/kube-rbac-proxy/0.log" Mar 21 05:35:10 crc kubenswrapper[4839]: I0321 05:35:10.480272 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nmj8p_c4d393d7-42d7-4b7d-a3cd-f7e325b97c54/machine-api-operator/0.log" Mar 21 05:35:26 crc kubenswrapper[4839]: I0321 05:35:26.617429 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-x2cpt_daed7a16-7023-463e-9d60-3f56f091f73e/cert-manager-controller/0.log" Mar 21 05:35:26 crc kubenswrapper[4839]: I0321 05:35:26.876898 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-v297k_814a91ac-5e2f-4479-88a3-254e4216e50c/cert-manager-cainjector/0.log" Mar 21 05:35:26 crc kubenswrapper[4839]: I0321 05:35:26.930285 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-s9zj6_d70f5b8f-f5a8-4829-b4e1-7a7a12dddd1f/cert-manager-webhook/0.log" Mar 21 05:35:40 crc kubenswrapper[4839]: I0321 05:35:40.002557 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-j5z4g_8e7a66bb-3731-4f75-9a7f-5b9d07a36b39/nmstate-console-plugin/0.log" Mar 21 05:35:40 crc kubenswrapper[4839]: I0321 05:35:40.199140 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-k57vv_42329e42-8b9b-45ed-ab04-bf12468d8859/nmstate-handler/0.log" Mar 21 05:35:40 crc kubenswrapper[4839]: I0321 05:35:40.252085 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-z5wkc_fdc1639d-742f-41a6-8cb7-318997a4a8b1/kube-rbac-proxy/0.log" Mar 21 05:35:40 crc kubenswrapper[4839]: I0321 05:35:40.559760 4839 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-vrlf4_fbd83ba5-ac43-45f6-8a15-78ba82a246f7/nmstate-operator/0.log" Mar 21 05:35:41 crc kubenswrapper[4839]: I0321 05:35:41.073026 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-z5wkc_fdc1639d-742f-41a6-8cb7-318997a4a8b1/nmstate-metrics/0.log" Mar 21 05:35:41 crc kubenswrapper[4839]: I0321 05:35:41.241408 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-7ghd4_5a2485ca-cb21-4edf-b074-f7ac255f45f8/nmstate-webhook/0.log" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.153887 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567856-pzqfl"] Mar 21 05:36:00 crc kubenswrapper[4839]: E0321 05:36:00.155717 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec1d36d-4ff4-4f29-9d04-59f088e00f09" containerName="oc" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.155735 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec1d36d-4ff4-4f29-9d04-59f088e00f09" containerName="oc" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.156003 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec1d36d-4ff4-4f29-9d04-59f088e00f09" containerName="oc" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.156871 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-pzqfl" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.159220 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.159603 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.159972 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.162530 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8vsj\" (UniqueName: \"kubernetes.io/projected/a61d1142-1394-4cf7-a8f7-6f1841a6694d-kube-api-access-f8vsj\") pod \"auto-csr-approver-29567856-pzqfl\" (UID: \"a61d1142-1394-4cf7-a8f7-6f1841a6694d\") " pod="openshift-infra/auto-csr-approver-29567856-pzqfl" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.178884 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-pzqfl"] Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.265787 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8vsj\" (UniqueName: \"kubernetes.io/projected/a61d1142-1394-4cf7-a8f7-6f1841a6694d-kube-api-access-f8vsj\") pod \"auto-csr-approver-29567856-pzqfl\" (UID: \"a61d1142-1394-4cf7-a8f7-6f1841a6694d\") " pod="openshift-infra/auto-csr-approver-29567856-pzqfl" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.293881 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8vsj\" (UniqueName: \"kubernetes.io/projected/a61d1142-1394-4cf7-a8f7-6f1841a6694d-kube-api-access-f8vsj\") pod \"auto-csr-approver-29567856-pzqfl\" (UID: \"a61d1142-1394-4cf7-a8f7-6f1841a6694d\") " 
pod="openshift-infra/auto-csr-approver-29567856-pzqfl" Mar 21 05:36:00 crc kubenswrapper[4839]: I0321 05:36:00.530411 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-pzqfl" Mar 21 05:36:01 crc kubenswrapper[4839]: I0321 05:36:01.003275 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-pzqfl"] Mar 21 05:36:01 crc kubenswrapper[4839]: I0321 05:36:01.299681 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:36:02 crc kubenswrapper[4839]: I0321 05:36:02.006552 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567856-pzqfl" event={"ID":"a61d1142-1394-4cf7-a8f7-6f1841a6694d","Type":"ContainerStarted","Data":"ad2b19f1e54e88969f7e8e11b355551de72d748b4608d9eeafcca46f8f257e9d"} Mar 21 05:36:04 crc kubenswrapper[4839]: I0321 05:36:04.023916 4839 generic.go:334] "Generic (PLEG): container finished" podID="a61d1142-1394-4cf7-a8f7-6f1841a6694d" containerID="3b72b1b0e5d05a1d6603f6bb93e0270d894bb25f1de761d8b7c5c8644a45fe83" exitCode=0 Mar 21 05:36:04 crc kubenswrapper[4839]: I0321 05:36:04.023980 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567856-pzqfl" event={"ID":"a61d1142-1394-4cf7-a8f7-6f1841a6694d","Type":"ContainerDied","Data":"3b72b1b0e5d05a1d6603f6bb93e0270d894bb25f1de761d8b7c5c8644a45fe83"} Mar 21 05:36:05 crc kubenswrapper[4839]: I0321 05:36:05.371079 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-pzqfl" Mar 21 05:36:05 crc kubenswrapper[4839]: I0321 05:36:05.484516 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8vsj\" (UniqueName: \"kubernetes.io/projected/a61d1142-1394-4cf7-a8f7-6f1841a6694d-kube-api-access-f8vsj\") pod \"a61d1142-1394-4cf7-a8f7-6f1841a6694d\" (UID: \"a61d1142-1394-4cf7-a8f7-6f1841a6694d\") " Mar 21 05:36:05 crc kubenswrapper[4839]: I0321 05:36:05.490174 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a61d1142-1394-4cf7-a8f7-6f1841a6694d-kube-api-access-f8vsj" (OuterVolumeSpecName: "kube-api-access-f8vsj") pod "a61d1142-1394-4cf7-a8f7-6f1841a6694d" (UID: "a61d1142-1394-4cf7-a8f7-6f1841a6694d"). InnerVolumeSpecName "kube-api-access-f8vsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:36:05 crc kubenswrapper[4839]: I0321 05:36:05.587083 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8vsj\" (UniqueName: \"kubernetes.io/projected/a61d1142-1394-4cf7-a8f7-6f1841a6694d-kube-api-access-f8vsj\") on node \"crc\" DevicePath \"\"" Mar 21 05:36:06 crc kubenswrapper[4839]: I0321 05:36:06.047432 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567856-pzqfl" event={"ID":"a61d1142-1394-4cf7-a8f7-6f1841a6694d","Type":"ContainerDied","Data":"ad2b19f1e54e88969f7e8e11b355551de72d748b4608d9eeafcca46f8f257e9d"} Mar 21 05:36:06 crc kubenswrapper[4839]: I0321 05:36:06.047479 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567856-pzqfl" Mar 21 05:36:06 crc kubenswrapper[4839]: I0321 05:36:06.047501 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad2b19f1e54e88969f7e8e11b355551de72d748b4608d9eeafcca46f8f257e9d" Mar 21 05:36:06 crc kubenswrapper[4839]: I0321 05:36:06.438873 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-6q472"] Mar 21 05:36:06 crc kubenswrapper[4839]: I0321 05:36:06.447636 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567850-6q472"] Mar 21 05:36:06 crc kubenswrapper[4839]: I0321 05:36:06.464834 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ff27433-bc42-4edf-bcac-48ffe5e0680a" path="/var/lib/kubelet/pods/5ff27433-bc42-4edf-bcac-48ffe5e0680a/volumes" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.180901 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-q9zb9_f0373e22-a3f9-48c6-abd6-fc8147ea49e6/kube-rbac-proxy/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.285718 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-q9zb9_f0373e22-a3f9-48c6-abd6-fc8147ea49e6/controller/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.425077 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.540362 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.581462 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log" Mar 21 
05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.590710 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.625523 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.868836 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.872045 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.878453 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log" Mar 21 05:36:10 crc kubenswrapper[4839]: I0321 05:36:10.882099 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.061280 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-reloader/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.064492 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-metrics/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.087120 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/cp-frr-files/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.129391 4839 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/controller/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.264242 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/frr-metrics/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.275462 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/kube-rbac-proxy/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.345560 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/kube-rbac-proxy-frr/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.476763 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/reloader/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.562642 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-qm7jb_06b3d06a-d515-469a-9a88-77b3f1e6c6f0/frr-k8s-webhook-server/0.log" Mar 21 05:36:11 crc kubenswrapper[4839]: I0321 05:36:11.735866 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b8d865685-2pk4g_888cdc0b-241d-456a-9a9f-3ed253b3dbf3/manager/0.log" Mar 21 05:36:12 crc kubenswrapper[4839]: I0321 05:36:12.288289 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2wb4_6b330e86-2ac2-4bee-8a6e-364cb2f093d7/kube-rbac-proxy/0.log" Mar 21 05:36:12 crc kubenswrapper[4839]: I0321 05:36:12.289988 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7df97b96d6-7wvzr_ca0627e2-8115-4514-ba93-47e00a823a31/webhook-server/0.log" Mar 21 05:36:13 crc kubenswrapper[4839]: I0321 05:36:13.052394 4839 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lzrf7_822ff984-89c3-48d0-b420-4ecf223f8176/frr/0.log" Mar 21 05:36:13 crc kubenswrapper[4839]: I0321 05:36:13.068601 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b2wb4_6b330e86-2ac2-4bee-8a6e-364cb2f093d7/speaker/0.log" Mar 21 05:36:27 crc kubenswrapper[4839]: I0321 05:36:27.882798 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/util/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.030850 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/pull/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.069931 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/pull/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.082946 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/util/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.261197 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/util/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.283208 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/extract/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.293900 4839 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bb5km_e0eea72c-ae42-4ea4-a067-6ff3e853c081/pull/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.449281 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/util/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.682468 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/pull/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.690381 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/pull/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.695705 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/util/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.868902 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/extract/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.878142 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/util/0.log" Mar 21 05:36:28 crc kubenswrapper[4839]: I0321 05:36:28.904856 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18p48w_cb3471d2-6268-4816-bc09-31044e9989e7/pull/0.log" Mar 21 05:36:29 crc kubenswrapper[4839]: I0321 05:36:29.034359 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-utilities/0.log" Mar 21 05:36:29 crc kubenswrapper[4839]: I0321 05:36:29.216242 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-utilities/0.log" Mar 21 05:36:29 crc kubenswrapper[4839]: I0321 05:36:29.217650 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-content/0.log" Mar 21 05:36:29 crc kubenswrapper[4839]: I0321 05:36:29.222119 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-content/0.log" Mar 21 05:36:29 crc kubenswrapper[4839]: I0321 05:36:29.519352 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-utilities/0.log" Mar 21 05:36:29 crc kubenswrapper[4839]: I0321 05:36:29.593002 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/extract-content/0.log" Mar 21 05:36:29 crc kubenswrapper[4839]: I0321 05:36:29.817224 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-utilities/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.015916 4839 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-utilities/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.082477 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-content/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.113164 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xg8xw_1d4943ad-c109-47a0-bcc8-4eb1a89836ca/registry-server/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.121896 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-content/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.344959 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-utilities/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.375792 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/extract-content/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.573481 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qb9bp_df9bf95b-dc8f-4104-9c6c-873159393850/marketplace-operator/0.log" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.980735 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.980815 4839 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:36:30 crc kubenswrapper[4839]: I0321 05:36:30.982372 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-utilities/0.log" Mar 21 05:36:31 crc kubenswrapper[4839]: I0321 05:36:31.192909 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-utilities/0.log" Mar 21 05:36:31 crc kubenswrapper[4839]: I0321 05:36:31.201768 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-99hx2_51f96bb3-505b-4c7b-bc6d-b0a465c7daae/registry-server/0.log" Mar 21 05:36:31 crc kubenswrapper[4839]: I0321 05:36:31.233093 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-content/0.log" Mar 21 05:36:31 crc kubenswrapper[4839]: I0321 05:36:31.253743 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-content/0.log" Mar 21 05:36:31 crc kubenswrapper[4839]: I0321 05:36:31.424300 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-content/0.log" Mar 21 05:36:31 crc kubenswrapper[4839]: I0321 05:36:31.438787 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/extract-utilities/0.log" Mar 21 05:36:31 crc kubenswrapper[4839]: 
I0321 05:36:31.622311 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7sxqv_e28c0850-90f8-445b-be34-13ab0d940eb4/registry-server/0.log" Mar 21 05:36:32 crc kubenswrapper[4839]: I0321 05:36:32.269444 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-utilities/0.log" Mar 21 05:36:32 crc kubenswrapper[4839]: I0321 05:36:32.296696 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-utilities/0.log" Mar 21 05:36:32 crc kubenswrapper[4839]: I0321 05:36:32.355226 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-content/0.log" Mar 21 05:36:32 crc kubenswrapper[4839]: I0321 05:36:32.443352 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-utilities/0.log" Mar 21 05:36:32 crc kubenswrapper[4839]: I0321 05:36:32.450688 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-content/0.log" Mar 21 05:36:32 crc kubenswrapper[4839]: I0321 05:36:32.509926 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/extract-content/0.log" Mar 21 05:36:33 crc kubenswrapper[4839]: I0321 05:36:33.057151 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8p22k_d2de7c7a-fc46-44bc-9fad-d346e82f8ebc/registry-server/0.log" Mar 21 05:36:58 crc kubenswrapper[4839]: I0321 05:36:58.483892 4839 scope.go:117] "RemoveContainer" 
containerID="32ef2594966320293c7652dfc99c30b2eedf27f32e9592ed12c4d3d92de56d1a" Mar 21 05:37:00 crc kubenswrapper[4839]: I0321 05:37:00.980288 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:37:00 crc kubenswrapper[4839]: I0321 05:37:00.980814 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:37:30 crc kubenswrapper[4839]: I0321 05:37:30.980645 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:37:30 crc kubenswrapper[4839]: I0321 05:37:30.981262 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:37:30 crc kubenswrapper[4839]: I0321 05:37:30.981312 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:37:30 crc kubenswrapper[4839]: I0321 05:37:30.982136 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"c9fc249e5d17a6c2fdbc1eaec440716c42661d1c2e7c8e6b17923104003e02fe"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 21 05:37:30 crc kubenswrapper[4839]: I0321 05:37:30.982195 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://c9fc249e5d17a6c2fdbc1eaec440716c42661d1c2e7c8e6b17923104003e02fe" gracePeriod=600 Mar 21 05:37:31 crc kubenswrapper[4839]: I0321 05:37:31.154009 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="c9fc249e5d17a6c2fdbc1eaec440716c42661d1c2e7c8e6b17923104003e02fe" exitCode=0 Mar 21 05:37:31 crc kubenswrapper[4839]: I0321 05:37:31.154055 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"c9fc249e5d17a6c2fdbc1eaec440716c42661d1c2e7c8e6b17923104003e02fe"} Mar 21 05:37:31 crc kubenswrapper[4839]: I0321 05:37:31.154096 4839 scope.go:117] "RemoveContainer" containerID="135bd8a875eb5ef2e4c8acd69ed54a00aa689eabe31cc4130e794bbf798a359d" Mar 21 05:37:32 crc kubenswrapper[4839]: I0321 05:37:32.166333 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerStarted","Data":"3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58"} Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.152817 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567858-qlqwl"] Mar 21 05:38:00 crc kubenswrapper[4839]: E0321 
05:38:00.153627 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a61d1142-1394-4cf7-a8f7-6f1841a6694d" containerName="oc" Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.153640 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="a61d1142-1394-4cf7-a8f7-6f1841a6694d" containerName="oc" Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.153916 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="a61d1142-1394-4cf7-a8f7-6f1841a6694d" containerName="oc" Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.154447 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567858-qlqwl"] Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.154514 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-qlqwl" Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.201241 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.201334 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.201636 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.220159 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdr25\" (UniqueName: \"kubernetes.io/projected/f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f-kube-api-access-hdr25\") pod \"auto-csr-approver-29567858-qlqwl\" (UID: \"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f\") " pod="openshift-infra/auto-csr-approver-29567858-qlqwl" Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.321597 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hdr25\" (UniqueName: \"kubernetes.io/projected/f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f-kube-api-access-hdr25\") pod \"auto-csr-approver-29567858-qlqwl\" (UID: \"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f\") " pod="openshift-infra/auto-csr-approver-29567858-qlqwl" Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.340247 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdr25\" (UniqueName: \"kubernetes.io/projected/f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f-kube-api-access-hdr25\") pod \"auto-csr-approver-29567858-qlqwl\" (UID: \"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f\") " pod="openshift-infra/auto-csr-approver-29567858-qlqwl" Mar 21 05:38:00 crc kubenswrapper[4839]: I0321 05:38:00.531929 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-qlqwl" Mar 21 05:38:01 crc kubenswrapper[4839]: I0321 05:38:01.007440 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567858-qlqwl"] Mar 21 05:38:01 crc kubenswrapper[4839]: W0321 05:38:01.008982 4839 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf62e08ee_35a1_4db2_9d9a_de4f9de7fd5f.slice/crio-3ad7e02fa72dc04fd3f928962090051c901d4ac848ee613c941d18a83ca24702 WatchSource:0}: Error finding container 3ad7e02fa72dc04fd3f928962090051c901d4ac848ee613c941d18a83ca24702: Status 404 returned error can't find the container with id 3ad7e02fa72dc04fd3f928962090051c901d4ac848ee613c941d18a83ca24702 Mar 21 05:38:01 crc kubenswrapper[4839]: I0321 05:38:01.155353 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567858-qlqwl" event={"ID":"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f","Type":"ContainerStarted","Data":"3ad7e02fa72dc04fd3f928962090051c901d4ac848ee613c941d18a83ca24702"} Mar 21 05:38:03 crc kubenswrapper[4839]: I0321 05:38:03.191933 4839 generic.go:334] 
"Generic (PLEG): container finished" podID="f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f" containerID="85a2cbb3a85126b520d32ce4bc2403f6773bb3f095dc1b0013f7736ed37e9add" exitCode=0 Mar 21 05:38:03 crc kubenswrapper[4839]: I0321 05:38:03.191974 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567858-qlqwl" event={"ID":"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f","Type":"ContainerDied","Data":"85a2cbb3a85126b520d32ce4bc2403f6773bb3f095dc1b0013f7736ed37e9add"} Mar 21 05:38:04 crc kubenswrapper[4839]: I0321 05:38:04.508241 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-qlqwl" Mar 21 05:38:04 crc kubenswrapper[4839]: I0321 05:38:04.617306 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdr25\" (UniqueName: \"kubernetes.io/projected/f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f-kube-api-access-hdr25\") pod \"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f\" (UID: \"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f\") " Mar 21 05:38:04 crc kubenswrapper[4839]: I0321 05:38:04.623986 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f-kube-api-access-hdr25" (OuterVolumeSpecName: "kube-api-access-hdr25") pod "f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f" (UID: "f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f"). InnerVolumeSpecName "kube-api-access-hdr25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:38:04 crc kubenswrapper[4839]: I0321 05:38:04.720258 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdr25\" (UniqueName: \"kubernetes.io/projected/f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f-kube-api-access-hdr25\") on node \"crc\" DevicePath \"\"" Mar 21 05:38:05 crc kubenswrapper[4839]: I0321 05:38:05.217477 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567858-qlqwl" Mar 21 05:38:05 crc kubenswrapper[4839]: I0321 05:38:05.217474 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567858-qlqwl" event={"ID":"f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f","Type":"ContainerDied","Data":"3ad7e02fa72dc04fd3f928962090051c901d4ac848ee613c941d18a83ca24702"} Mar 21 05:38:05 crc kubenswrapper[4839]: I0321 05:38:05.217561 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ad7e02fa72dc04fd3f928962090051c901d4ac848ee613c941d18a83ca24702" Mar 21 05:38:05 crc kubenswrapper[4839]: I0321 05:38:05.602202 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-gb6qv"] Mar 21 05:38:05 crc kubenswrapper[4839]: I0321 05:38:05.612949 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567852-gb6qv"] Mar 21 05:38:06 crc kubenswrapper[4839]: I0321 05:38:06.474200 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1" path="/var/lib/kubelet/pods/d8ef5a8a-ae63-4e94-9c1e-1e7d5e4a99c1/volumes" Mar 21 05:38:33 crc kubenswrapper[4839]: I0321 05:38:33.834829 4839 generic.go:334] "Generic (PLEG): container finished" podID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerID="14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e" exitCode=0 Mar 21 05:38:33 crc kubenswrapper[4839]: I0321 05:38:33.834937 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" event={"ID":"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d","Type":"ContainerDied","Data":"14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e"} Mar 21 05:38:33 crc kubenswrapper[4839]: I0321 05:38:33.836139 4839 scope.go:117] "RemoveContainer" containerID="14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e" Mar 21 05:38:34 crc 
kubenswrapper[4839]: I0321 05:38:34.243043 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ppvlf_must-gather-sjwj7_5072f4c5-1de6-4d8c-b69c-72d081fc7a0d/gather/0.log" Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.265332 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ll62v"] Mar 21 05:38:42 crc kubenswrapper[4839]: E0321 05:38:42.266432 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f" containerName="oc" Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.266450 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f" containerName="oc" Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.268036 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62e08ee-35a1-4db2-9d9a-de4f9de7fd5f" containerName="oc" Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.269419 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.289665 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ll62v"] Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.345536 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk5px\" (UniqueName: \"kubernetes.io/projected/78b94c3e-17dd-4253-8aed-25de5cbc0215-kube-api-access-dk5px\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.345608 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-catalog-content\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.345705 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-utilities\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.447289 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk5px\" (UniqueName: \"kubernetes.io/projected/78b94c3e-17dd-4253-8aed-25de5cbc0215-kube-api-access-dk5px\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.447354 4839 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-catalog-content\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.447416 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-utilities\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.448006 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-catalog-content\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.448095 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-utilities\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.477422 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk5px\" (UniqueName: \"kubernetes.io/projected/78b94c3e-17dd-4253-8aed-25de5cbc0215-kube-api-access-dk5px\") pod \"certified-operators-ll62v\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:42 crc kubenswrapper[4839]: I0321 05:38:42.591094 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:43 crc kubenswrapper[4839]: I0321 05:38:43.140273 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ll62v"] Mar 21 05:38:43 crc kubenswrapper[4839]: I0321 05:38:43.949655 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerStarted","Data":"660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166"} Mar 21 05:38:43 crc kubenswrapper[4839]: I0321 05:38:43.949728 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerStarted","Data":"240f05dbf34fd755fdacb0ccc96083ada20d0c3272a8aa95239fc19a9ccae79a"} Mar 21 05:38:44 crc kubenswrapper[4839]: I0321 05:38:44.961188 4839 generic.go:334] "Generic (PLEG): container finished" podID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerID="660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166" exitCode=0 Mar 21 05:38:44 crc kubenswrapper[4839]: I0321 05:38:44.961293 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerDied","Data":"660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166"} Mar 21 05:38:44 crc kubenswrapper[4839]: I0321 05:38:44.961394 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerStarted","Data":"d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448"} Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.147778 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ppvlf/must-gather-sjwj7"] Mar 21 05:38:45 crc 
kubenswrapper[4839]: I0321 05:38:45.148108 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerName="copy" containerID="cri-o://048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8" gracePeriod=2 Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.157434 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ppvlf/must-gather-sjwj7"] Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.587708 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ppvlf_must-gather-sjwj7_5072f4c5-1de6-4d8c-b69c-72d081fc7a0d/copy/0.log" Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.588528 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.715160 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-must-gather-output\") pod \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") " Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.715279 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx5vq\" (UniqueName: \"kubernetes.io/projected/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-kube-api-access-rx5vq\") pod \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\" (UID: \"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d\") " Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.721921 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-kube-api-access-rx5vq" (OuterVolumeSpecName: "kube-api-access-rx5vq") pod "5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" (UID: 
"5072f4c5-1de6-4d8c-b69c-72d081fc7a0d"). InnerVolumeSpecName "kube-api-access-rx5vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.817948 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx5vq\" (UniqueName: \"kubernetes.io/projected/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-kube-api-access-rx5vq\") on node \"crc\" DevicePath \"\"" Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.878916 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" (UID: "5072f4c5-1de6-4d8c-b69c-72d081fc7a0d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.920351 4839 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.971643 4839 generic.go:334] "Generic (PLEG): container finished" podID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerID="d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448" exitCode=0 Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.971717 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerDied","Data":"d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448"} Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.974435 4839 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ppvlf_must-gather-sjwj7_5072f4c5-1de6-4d8c-b69c-72d081fc7a0d/copy/0.log" Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 
05:38:45.974901 4839 generic.go:334] "Generic (PLEG): container finished" podID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerID="048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8" exitCode=143 Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.974934 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ppvlf/must-gather-sjwj7" Mar 21 05:38:45 crc kubenswrapper[4839]: I0321 05:38:45.974974 4839 scope.go:117] "RemoveContainer" containerID="048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8" Mar 21 05:38:46 crc kubenswrapper[4839]: I0321 05:38:46.004720 4839 scope.go:117] "RemoveContainer" containerID="14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e" Mar 21 05:38:46 crc kubenswrapper[4839]: I0321 05:38:46.055015 4839 scope.go:117] "RemoveContainer" containerID="048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8" Mar 21 05:38:46 crc kubenswrapper[4839]: E0321 05:38:46.055460 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8\": container with ID starting with 048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8 not found: ID does not exist" containerID="048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8" Mar 21 05:38:46 crc kubenswrapper[4839]: I0321 05:38:46.055496 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8"} err="failed to get container status \"048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8\": rpc error: code = NotFound desc = could not find container \"048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8\": container with ID starting with 048a66db59bdf132d5c901dfec60d6fc743b835eddb4d17b55a8605cd2b2d8b8 not found: ID does not 
exist" Mar 21 05:38:46 crc kubenswrapper[4839]: I0321 05:38:46.055516 4839 scope.go:117] "RemoveContainer" containerID="14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e" Mar 21 05:38:46 crc kubenswrapper[4839]: E0321 05:38:46.055951 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e\": container with ID starting with 14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e not found: ID does not exist" containerID="14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e" Mar 21 05:38:46 crc kubenswrapper[4839]: I0321 05:38:46.055982 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e"} err="failed to get container status \"14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e\": rpc error: code = NotFound desc = could not find container \"14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e\": container with ID starting with 14e8bebce0253c85aad33e0b71ce4edd291dedb5a18a4452ca51e078dab4db0e not found: ID does not exist" Mar 21 05:38:46 crc kubenswrapper[4839]: I0321 05:38:46.466205 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" path="/var/lib/kubelet/pods/5072f4c5-1de6-4d8c-b69c-72d081fc7a0d/volumes" Mar 21 05:38:46 crc kubenswrapper[4839]: I0321 05:38:46.984497 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerStarted","Data":"3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1"} Mar 21 05:38:47 crc kubenswrapper[4839]: I0321 05:38:47.010656 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-ll62v" podStartSLOduration=2.510352924 podStartE2EDuration="5.010636726s" podCreationTimestamp="2026-03-21 05:38:42 +0000 UTC" firstStartedPulling="2026-03-21 05:38:43.952029078 +0000 UTC m=+4528.279815764" lastFinishedPulling="2026-03-21 05:38:46.45231289 +0000 UTC m=+4530.780099566" observedRunningTime="2026-03-21 05:38:47.002768764 +0000 UTC m=+4531.330555450" watchObservedRunningTime="2026-03-21 05:38:47.010636726 +0000 UTC m=+4531.338423402" Mar 21 05:38:52 crc kubenswrapper[4839]: I0321 05:38:52.591382 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:52 crc kubenswrapper[4839]: I0321 05:38:52.591984 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:52 crc kubenswrapper[4839]: I0321 05:38:52.819072 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:53 crc kubenswrapper[4839]: I0321 05:38:53.088304 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:53 crc kubenswrapper[4839]: I0321 05:38:53.149261 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ll62v"] Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.055845 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ll62v" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerName="registry-server" containerID="cri-o://3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1" gracePeriod=2 Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.508687 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.624323 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk5px\" (UniqueName: \"kubernetes.io/projected/78b94c3e-17dd-4253-8aed-25de5cbc0215-kube-api-access-dk5px\") pod \"78b94c3e-17dd-4253-8aed-25de5cbc0215\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.624507 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-catalog-content\") pod \"78b94c3e-17dd-4253-8aed-25de5cbc0215\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.625017 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-utilities\") pod \"78b94c3e-17dd-4253-8aed-25de5cbc0215\" (UID: \"78b94c3e-17dd-4253-8aed-25de5cbc0215\") " Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.626306 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-utilities" (OuterVolumeSpecName: "utilities") pod "78b94c3e-17dd-4253-8aed-25de5cbc0215" (UID: "78b94c3e-17dd-4253-8aed-25de5cbc0215"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.636148 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b94c3e-17dd-4253-8aed-25de5cbc0215-kube-api-access-dk5px" (OuterVolumeSpecName: "kube-api-access-dk5px") pod "78b94c3e-17dd-4253-8aed-25de5cbc0215" (UID: "78b94c3e-17dd-4253-8aed-25de5cbc0215"). InnerVolumeSpecName "kube-api-access-dk5px". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.728424 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-utilities\") on node \"crc\" DevicePath \"\"" Mar 21 05:38:55 crc kubenswrapper[4839]: I0321 05:38:55.728837 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk5px\" (UniqueName: \"kubernetes.io/projected/78b94c3e-17dd-4253-8aed-25de5cbc0215-kube-api-access-dk5px\") on node \"crc\" DevicePath \"\"" Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.071391 4839 generic.go:334] "Generic (PLEG): container finished" podID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerID="3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1" exitCode=0 Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.071466 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerDied","Data":"3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1"} Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.071563 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ll62v" event={"ID":"78b94c3e-17dd-4253-8aed-25de5cbc0215","Type":"ContainerDied","Data":"240f05dbf34fd755fdacb0ccc96083ada20d0c3272a8aa95239fc19a9ccae79a"} Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.071624 4839 scope.go:117] "RemoveContainer" containerID="3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1" Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.074526 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ll62v" Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.122518 4839 scope.go:117] "RemoveContainer" containerID="d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448" Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.153685 4839 scope.go:117] "RemoveContainer" containerID="660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166" Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.197069 4839 scope.go:117] "RemoveContainer" containerID="3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1" Mar 21 05:38:56 crc kubenswrapper[4839]: E0321 05:38:56.197711 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1\": container with ID starting with 3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1 not found: ID does not exist" containerID="3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1" Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.197781 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1"} err="failed to get container status \"3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1\": rpc error: code = NotFound desc = could not find container \"3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1\": container with ID starting with 3c1fee562067ca9358c26f54d92cb826bf4b9fffd2460d92100449ea81f1bed1 not found: ID does not exist" Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.197826 4839 scope.go:117] "RemoveContainer" containerID="d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448" Mar 21 05:38:56 crc kubenswrapper[4839]: E0321 05:38:56.198134 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448\": container with ID starting with d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448 not found: ID does not exist" containerID="d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448" Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.198168 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448"} err="failed to get container status \"d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448\": rpc error: code = NotFound desc = could not find container \"d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448\": container with ID starting with d455542feefa66517496e53183dac2ab3a6f9cbc8f06950babf0fd1a21727448 not found: ID does not exist" Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.198183 4839 scope.go:117] "RemoveContainer" containerID="660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166" Mar 21 05:38:56 crc kubenswrapper[4839]: E0321 05:38:56.198374 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166\": container with ID starting with 660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166 not found: ID does not exist" containerID="660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166" Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.198395 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166"} err="failed to get container status \"660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166\": rpc error: code = NotFound desc = could not find container 
\"660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166\": container with ID starting with 660ab990dd2c9102a94a5ac35184e719e4e6c6f722f12c8ccf1fe4b3ae9ff166 not found: ID does not exist" Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.512695 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78b94c3e-17dd-4253-8aed-25de5cbc0215" (UID: "78b94c3e-17dd-4253-8aed-25de5cbc0215"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.548085 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b94c3e-17dd-4253-8aed-25de5cbc0215-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.712919 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ll62v"] Mar 21 05:38:56 crc kubenswrapper[4839]: I0321 05:38:56.726376 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ll62v"] Mar 21 05:38:58 crc kubenswrapper[4839]: I0321 05:38:58.465139 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" path="/var/lib/kubelet/pods/78b94c3e-17dd-4253-8aed-25de5cbc0215/volumes" Mar 21 05:38:58 crc kubenswrapper[4839]: I0321 05:38:58.625012 4839 scope.go:117] "RemoveContainer" containerID="e0aaa7c76a0ee9b1660ca2e309fd9d60f43c9f5876dc19d939b4dd884d137805" Mar 21 05:38:58 crc kubenswrapper[4839]: I0321 05:38:58.653596 4839 scope.go:117] "RemoveContainer" containerID="cc3802ac333d73f4abb16330d261760555d938cdc36d0050dadf5466674b13ba" Mar 21 05:39:58 crc kubenswrapper[4839]: I0321 05:39:58.845476 4839 scope.go:117] "RemoveContainer" 
containerID="d991b608c7dd15cd8e8f6e12d6073ad24091724986f4f1fa631390572cd83d55" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.148246 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567860-qprns"] Mar 21 05:40:00 crc kubenswrapper[4839]: E0321 05:40:00.149109 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerName="gather" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149128 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerName="gather" Mar 21 05:40:00 crc kubenswrapper[4839]: E0321 05:40:00.149148 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerName="registry-server" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149156 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerName="registry-server" Mar 21 05:40:00 crc kubenswrapper[4839]: E0321 05:40:00.149188 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerName="copy" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149198 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerName="copy" Mar 21 05:40:00 crc kubenswrapper[4839]: E0321 05:40:00.149215 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerName="extract-utilities" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149223 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerName="extract-utilities" Mar 21 05:40:00 crc kubenswrapper[4839]: E0321 05:40:00.149250 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" 
containerName="extract-content" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149258 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerName="extract-content" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149450 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b94c3e-17dd-4253-8aed-25de5cbc0215" containerName="registry-server" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149487 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerName="gather" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.149504 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="5072f4c5-1de6-4d8c-b69c-72d081fc7a0d" containerName="copy" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.150321 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-qprns" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.152400 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.152548 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.153344 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.163272 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567860-qprns"] Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.194394 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5f2k\" (UniqueName: 
\"kubernetes.io/projected/3bb631e5-c431-43b1-8e8b-ebe2a9e4842d-kube-api-access-p5f2k\") pod \"auto-csr-approver-29567860-qprns\" (UID: \"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d\") " pod="openshift-infra/auto-csr-approver-29567860-qprns" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.297203 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5f2k\" (UniqueName: \"kubernetes.io/projected/3bb631e5-c431-43b1-8e8b-ebe2a9e4842d-kube-api-access-p5f2k\") pod \"auto-csr-approver-29567860-qprns\" (UID: \"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d\") " pod="openshift-infra/auto-csr-approver-29567860-qprns" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.331052 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5f2k\" (UniqueName: \"kubernetes.io/projected/3bb631e5-c431-43b1-8e8b-ebe2a9e4842d-kube-api-access-p5f2k\") pod \"auto-csr-approver-29567860-qprns\" (UID: \"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d\") " pod="openshift-infra/auto-csr-approver-29567860-qprns" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.500298 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-qprns" Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.963015 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567860-qprns"] Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.981181 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:40:00 crc kubenswrapper[4839]: I0321 05:40:00.981288 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:40:01 crc kubenswrapper[4839]: I0321 05:40:01.514861 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567860-qprns" event={"ID":"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d","Type":"ContainerStarted","Data":"36e0c52dbe7f2a3766441d105e4720ecb523bb955499d35c4c63009ee9ac6b0b"} Mar 21 05:40:03 crc kubenswrapper[4839]: I0321 05:40:03.533101 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567860-qprns" event={"ID":"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d","Type":"ContainerStarted","Data":"1abaabd278447b88afde2bcc37993f253c31a34d2313296e6e7167afadc39a08"} Mar 21 05:40:03 crc kubenswrapper[4839]: I0321 05:40:03.553109 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567860-qprns" podStartSLOduration=1.627987514 podStartE2EDuration="3.553093473s" podCreationTimestamp="2026-03-21 05:40:00 +0000 UTC" firstStartedPulling="2026-03-21 
05:40:00.965990521 +0000 UTC m=+4605.293777197" lastFinishedPulling="2026-03-21 05:40:02.89109647 +0000 UTC m=+4607.218883156" observedRunningTime="2026-03-21 05:40:03.549851152 +0000 UTC m=+4607.877637828" watchObservedRunningTime="2026-03-21 05:40:03.553093473 +0000 UTC m=+4607.880880149" Mar 21 05:40:04 crc kubenswrapper[4839]: I0321 05:40:04.544439 4839 generic.go:334] "Generic (PLEG): container finished" podID="3bb631e5-c431-43b1-8e8b-ebe2a9e4842d" containerID="1abaabd278447b88afde2bcc37993f253c31a34d2313296e6e7167afadc39a08" exitCode=0 Mar 21 05:40:04 crc kubenswrapper[4839]: I0321 05:40:04.544539 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567860-qprns" event={"ID":"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d","Type":"ContainerDied","Data":"1abaabd278447b88afde2bcc37993f253c31a34d2313296e6e7167afadc39a08"} Mar 21 05:40:05 crc kubenswrapper[4839]: I0321 05:40:05.896590 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-qprns" Mar 21 05:40:05 crc kubenswrapper[4839]: I0321 05:40:05.903393 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5f2k\" (UniqueName: \"kubernetes.io/projected/3bb631e5-c431-43b1-8e8b-ebe2a9e4842d-kube-api-access-p5f2k\") pod \"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d\" (UID: \"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d\") " Mar 21 05:40:05 crc kubenswrapper[4839]: I0321 05:40:05.912641 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb631e5-c431-43b1-8e8b-ebe2a9e4842d-kube-api-access-p5f2k" (OuterVolumeSpecName: "kube-api-access-p5f2k") pod "3bb631e5-c431-43b1-8e8b-ebe2a9e4842d" (UID: "3bb631e5-c431-43b1-8e8b-ebe2a9e4842d"). InnerVolumeSpecName "kube-api-access-p5f2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:40:06 crc kubenswrapper[4839]: I0321 05:40:06.005052 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5f2k\" (UniqueName: \"kubernetes.io/projected/3bb631e5-c431-43b1-8e8b-ebe2a9e4842d-kube-api-access-p5f2k\") on node \"crc\" DevicePath \"\"" Mar 21 05:40:06 crc kubenswrapper[4839]: I0321 05:40:06.564383 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567860-qprns" event={"ID":"3bb631e5-c431-43b1-8e8b-ebe2a9e4842d","Type":"ContainerDied","Data":"36e0c52dbe7f2a3766441d105e4720ecb523bb955499d35c4c63009ee9ac6b0b"} Mar 21 05:40:06 crc kubenswrapper[4839]: I0321 05:40:06.564455 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36e0c52dbe7f2a3766441d105e4720ecb523bb955499d35c4c63009ee9ac6b0b" Mar 21 05:40:06 crc kubenswrapper[4839]: I0321 05:40:06.564538 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567860-qprns" Mar 21 05:40:06 crc kubenswrapper[4839]: I0321 05:40:06.623517 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-85fvh"] Mar 21 05:40:06 crc kubenswrapper[4839]: I0321 05:40:06.634793 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567854-85fvh"] Mar 21 05:40:08 crc kubenswrapper[4839]: I0321 05:40:08.474814 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bec1d36d-4ff4-4f29-9d04-59f088e00f09" path="/var/lib/kubelet/pods/bec1d36d-4ff4-4f29-9d04-59f088e00f09/volumes" Mar 21 05:40:30 crc kubenswrapper[4839]: I0321 05:40:30.980012 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 21 05:40:30 crc kubenswrapper[4839]: I0321 05:40:30.981696 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:40:58 crc kubenswrapper[4839]: I0321 05:40:58.899282 4839 scope.go:117] "RemoveContainer" containerID="1b773a94d9de7762b645d818a8305a6aa83ff1f49522be66070dc127da6682d7" Mar 21 05:41:00 crc kubenswrapper[4839]: I0321 05:41:00.980613 4839 patch_prober.go:28] interesting pod/machine-config-daemon-jx4q7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 21 05:41:00 crc kubenswrapper[4839]: I0321 05:41:00.981001 4839 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 21 05:41:00 crc kubenswrapper[4839]: I0321 05:41:00.981060 4839 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" Mar 21 05:41:00 crc kubenswrapper[4839]: I0321 05:41:00.981719 4839 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58"} pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
21 05:41:00 crc kubenswrapper[4839]: I0321 05:41:00.981803 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerName="machine-config-daemon" containerID="cri-o://3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" gracePeriod=600 Mar 21 05:41:01 crc kubenswrapper[4839]: E0321 05:41:01.175161 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:41:01 crc kubenswrapper[4839]: I0321 05:41:01.334393 4839 generic.go:334] "Generic (PLEG): container finished" podID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" exitCode=0 Mar 21 05:41:01 crc kubenswrapper[4839]: I0321 05:41:01.334467 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" event={"ID":"4f92fefb-d5cd-451a-8bbe-31eea55d5bd9","Type":"ContainerDied","Data":"3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58"} Mar 21 05:41:01 crc kubenswrapper[4839]: I0321 05:41:01.334534 4839 scope.go:117] "RemoveContainer" containerID="c9fc249e5d17a6c2fdbc1eaec440716c42661d1c2e7c8e6b17923104003e02fe" Mar 21 05:41:01 crc kubenswrapper[4839]: I0321 05:41:01.335530 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" Mar 21 05:41:01 crc kubenswrapper[4839]: E0321 05:41:01.336046 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:41:16 crc kubenswrapper[4839]: I0321 05:41:16.461474 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" Mar 21 05:41:16 crc kubenswrapper[4839]: E0321 05:41:16.463243 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:41:31 crc kubenswrapper[4839]: I0321 05:41:31.453944 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" Mar 21 05:41:31 crc kubenswrapper[4839]: E0321 05:41:31.454736 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:41:42 crc kubenswrapper[4839]: I0321 05:41:42.453115 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" Mar 21 05:41:42 crc kubenswrapper[4839]: E0321 05:41:42.453840 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:41:55 crc kubenswrapper[4839]: I0321 05:41:55.453370 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" Mar 21 05:41:55 crc kubenswrapper[4839]: E0321 05:41:55.455162 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.165618 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567862-dqhdk"] Mar 21 05:42:00 crc kubenswrapper[4839]: E0321 05:42:00.166651 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb631e5-c431-43b1-8e8b-ebe2a9e4842d" containerName="oc" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.166663 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb631e5-c431-43b1-8e8b-ebe2a9e4842d" containerName="oc" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.166834 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb631e5-c431-43b1-8e8b-ebe2a9e4842d" containerName="oc" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.167597 4839 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-dqhdk" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.169692 4839 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-swld2" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.170665 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.170829 4839 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.185667 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567862-dqhdk"] Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.328378 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c675\" (UniqueName: \"kubernetes.io/projected/310ca8e3-f2ad-491a-9453-3fc357628cd3-kube-api-access-8c675\") pod \"auto-csr-approver-29567862-dqhdk\" (UID: \"310ca8e3-f2ad-491a-9453-3fc357628cd3\") " pod="openshift-infra/auto-csr-approver-29567862-dqhdk" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.430240 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c675\" (UniqueName: \"kubernetes.io/projected/310ca8e3-f2ad-491a-9453-3fc357628cd3-kube-api-access-8c675\") pod \"auto-csr-approver-29567862-dqhdk\" (UID: \"310ca8e3-f2ad-491a-9453-3fc357628cd3\") " pod="openshift-infra/auto-csr-approver-29567862-dqhdk" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.448783 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c675\" (UniqueName: \"kubernetes.io/projected/310ca8e3-f2ad-491a-9453-3fc357628cd3-kube-api-access-8c675\") pod \"auto-csr-approver-29567862-dqhdk\" (UID: \"310ca8e3-f2ad-491a-9453-3fc357628cd3\") " 
pod="openshift-infra/auto-csr-approver-29567862-dqhdk" Mar 21 05:42:00 crc kubenswrapper[4839]: I0321 05:42:00.488931 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-dqhdk" Mar 21 05:42:01 crc kubenswrapper[4839]: I0321 05:42:01.295078 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567862-dqhdk"] Mar 21 05:42:01 crc kubenswrapper[4839]: I0321 05:42:01.301773 4839 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 21 05:42:02 crc kubenswrapper[4839]: I0321 05:42:02.084935 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567862-dqhdk" event={"ID":"310ca8e3-f2ad-491a-9453-3fc357628cd3","Type":"ContainerStarted","Data":"2a3365bede70aacb7ee31c031fed2b2720edc09728bd74044d8235e6cd6bccf7"} Mar 21 05:42:04 crc kubenswrapper[4839]: I0321 05:42:04.115254 4839 generic.go:334] "Generic (PLEG): container finished" podID="310ca8e3-f2ad-491a-9453-3fc357628cd3" containerID="93da64406c417a6e2ac4bf006b6bc4a4396a6d01c348ab94c9c8fddd70192ce0" exitCode=0 Mar 21 05:42:04 crc kubenswrapper[4839]: I0321 05:42:04.115323 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567862-dqhdk" event={"ID":"310ca8e3-f2ad-491a-9453-3fc357628cd3","Type":"ContainerDied","Data":"93da64406c417a6e2ac4bf006b6bc4a4396a6d01c348ab94c9c8fddd70192ce0"} Mar 21 05:42:05 crc kubenswrapper[4839]: I0321 05:42:05.525521 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-dqhdk" Mar 21 05:42:05 crc kubenswrapper[4839]: I0321 05:42:05.585521 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c675\" (UniqueName: \"kubernetes.io/projected/310ca8e3-f2ad-491a-9453-3fc357628cd3-kube-api-access-8c675\") pod \"310ca8e3-f2ad-491a-9453-3fc357628cd3\" (UID: \"310ca8e3-f2ad-491a-9453-3fc357628cd3\") " Mar 21 05:42:05 crc kubenswrapper[4839]: I0321 05:42:05.593157 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/310ca8e3-f2ad-491a-9453-3fc357628cd3-kube-api-access-8c675" (OuterVolumeSpecName: "kube-api-access-8c675") pod "310ca8e3-f2ad-491a-9453-3fc357628cd3" (UID: "310ca8e3-f2ad-491a-9453-3fc357628cd3"). InnerVolumeSpecName "kube-api-access-8c675". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 21 05:42:05 crc kubenswrapper[4839]: I0321 05:42:05.687246 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c675\" (UniqueName: \"kubernetes.io/projected/310ca8e3-f2ad-491a-9453-3fc357628cd3-kube-api-access-8c675\") on node \"crc\" DevicePath \"\"" Mar 21 05:42:06 crc kubenswrapper[4839]: I0321 05:42:06.134284 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567862-dqhdk" event={"ID":"310ca8e3-f2ad-491a-9453-3fc357628cd3","Type":"ContainerDied","Data":"2a3365bede70aacb7ee31c031fed2b2720edc09728bd74044d8235e6cd6bccf7"} Mar 21 05:42:06 crc kubenswrapper[4839]: I0321 05:42:06.134319 4839 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a3365bede70aacb7ee31c031fed2b2720edc09728bd74044d8235e6cd6bccf7" Mar 21 05:42:06 crc kubenswrapper[4839]: I0321 05:42:06.134676 4839 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567862-dqhdk" Mar 21 05:42:06 crc kubenswrapper[4839]: I0321 05:42:06.604436 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-pzqfl"] Mar 21 05:42:06 crc kubenswrapper[4839]: I0321 05:42:06.613766 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567856-pzqfl"] Mar 21 05:42:08 crc kubenswrapper[4839]: I0321 05:42:08.453500 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" Mar 21 05:42:08 crc kubenswrapper[4839]: E0321 05:42:08.459526 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:42:08 crc kubenswrapper[4839]: I0321 05:42:08.476242 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a61d1142-1394-4cf7-a8f7-6f1841a6694d" path="/var/lib/kubelet/pods/a61d1142-1394-4cf7-a8f7-6f1841a6694d/volumes" Mar 21 05:42:19 crc kubenswrapper[4839]: I0321 05:42:19.453205 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58" Mar 21 05:42:19 crc kubenswrapper[4839]: E0321 05:42:19.453861 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" 
podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9" Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.753720 4839 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nqgqm"] Mar 21 05:42:23 crc kubenswrapper[4839]: E0321 05:42:23.754777 4839 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="310ca8e3-f2ad-491a-9453-3fc357628cd3" containerName="oc" Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.754791 4839 state_mem.go:107] "Deleted CPUSet assignment" podUID="310ca8e3-f2ad-491a-9453-3fc357628cd3" containerName="oc" Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.754968 4839 memory_manager.go:354] "RemoveStaleState removing state" podUID="310ca8e3-f2ad-491a-9453-3fc357628cd3" containerName="oc" Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.756322 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqgqm" Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.766643 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqgqm"] Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.837922 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-catalog-content\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm" Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.837986 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp6sz\" (UniqueName: \"kubernetes.io/projected/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-kube-api-access-hp6sz\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm" Mar 21 
05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.838016 4839 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-utilities\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.939255 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-catalog-content\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.939302 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp6sz\" (UniqueName: \"kubernetes.io/projected/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-kube-api-access-hp6sz\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.939332 4839 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-utilities\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.939831 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-catalog-content\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.939863 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-utilities\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:23 crc kubenswrapper[4839]: I0321 05:42:23.960121 4839 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp6sz\" (UniqueName: \"kubernetes.io/projected/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-kube-api-access-hp6sz\") pod \"community-operators-nqgqm\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") " pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:24 crc kubenswrapper[4839]: I0321 05:42:24.112943 4839 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:24 crc kubenswrapper[4839]: I0321 05:42:24.651745 4839 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nqgqm"]
Mar 21 05:42:25 crc kubenswrapper[4839]: I0321 05:42:25.605526 4839 generic.go:334] "Generic (PLEG): container finished" podID="31974c6b-82d8-4d18-9dc2-9a7f29374d2f" containerID="2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af" exitCode=0
Mar 21 05:42:25 crc kubenswrapper[4839]: I0321 05:42:25.605607 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqgqm" event={"ID":"31974c6b-82d8-4d18-9dc2-9a7f29374d2f","Type":"ContainerDied","Data":"2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af"}
Mar 21 05:42:25 crc kubenswrapper[4839]: I0321 05:42:25.605835 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqgqm" event={"ID":"31974c6b-82d8-4d18-9dc2-9a7f29374d2f","Type":"ContainerStarted","Data":"69a350c23f645fd571cfc04982724c5750af661fc4c28d11db24e0b6e55d97e9"}
Mar 21 05:42:27 crc kubenswrapper[4839]: I0321 05:42:27.626184 4839 generic.go:334] "Generic (PLEG): container finished" podID="31974c6b-82d8-4d18-9dc2-9a7f29374d2f" containerID="b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00" exitCode=0
Mar 21 05:42:27 crc kubenswrapper[4839]: I0321 05:42:27.626256 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqgqm" event={"ID":"31974c6b-82d8-4d18-9dc2-9a7f29374d2f","Type":"ContainerDied","Data":"b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00"}
Mar 21 05:42:28 crc kubenswrapper[4839]: I0321 05:42:28.648790 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqgqm" event={"ID":"31974c6b-82d8-4d18-9dc2-9a7f29374d2f","Type":"ContainerStarted","Data":"8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571"}
Mar 21 05:42:28 crc kubenswrapper[4839]: I0321 05:42:28.678147 4839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nqgqm" podStartSLOduration=3.240137242 podStartE2EDuration="5.678125436s" podCreationTimestamp="2026-03-21 05:42:23 +0000 UTC" firstStartedPulling="2026-03-21 05:42:25.607720065 +0000 UTC m=+4749.935506741" lastFinishedPulling="2026-03-21 05:42:28.045708259 +0000 UTC m=+4752.373494935" observedRunningTime="2026-03-21 05:42:28.67545032 +0000 UTC m=+4753.003237026" watchObservedRunningTime="2026-03-21 05:42:28.678125436 +0000 UTC m=+4753.005912112"
Mar 21 05:42:34 crc kubenswrapper[4839]: I0321 05:42:34.114042 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:34 crc kubenswrapper[4839]: I0321 05:42:34.114323 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:34 crc kubenswrapper[4839]: I0321 05:42:34.162668 4839 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:34 crc kubenswrapper[4839]: I0321 05:42:34.454061 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58"
Mar 21 05:42:34 crc kubenswrapper[4839]: E0321 05:42:34.454467 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:42:34 crc kubenswrapper[4839]: I0321 05:42:34.760885 4839 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:34 crc kubenswrapper[4839]: I0321 05:42:34.821956 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nqgqm"]
Mar 21 05:42:36 crc kubenswrapper[4839]: I0321 05:42:36.731005 4839 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nqgqm" podUID="31974c6b-82d8-4d18-9dc2-9a7f29374d2f" containerName="registry-server" containerID="cri-o://8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571" gracePeriod=2
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.224342 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.328456 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-catalog-content\") pod \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") "
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.328555 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp6sz\" (UniqueName: \"kubernetes.io/projected/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-kube-api-access-hp6sz\") pod \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") "
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.328747 4839 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-utilities\") pod \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\" (UID: \"31974c6b-82d8-4d18-9dc2-9a7f29374d2f\") "
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.330046 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-utilities" (OuterVolumeSpecName: "utilities") pod "31974c6b-82d8-4d18-9dc2-9a7f29374d2f" (UID: "31974c6b-82d8-4d18-9dc2-9a7f29374d2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.338409 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-kube-api-access-hp6sz" (OuterVolumeSpecName: "kube-api-access-hp6sz") pod "31974c6b-82d8-4d18-9dc2-9a7f29374d2f" (UID: "31974c6b-82d8-4d18-9dc2-9a7f29374d2f"). InnerVolumeSpecName "kube-api-access-hp6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.388282 4839 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31974c6b-82d8-4d18-9dc2-9a7f29374d2f" (UID: "31974c6b-82d8-4d18-9dc2-9a7f29374d2f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.430884 4839 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.431150 4839 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp6sz\" (UniqueName: \"kubernetes.io/projected/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-kube-api-access-hp6sz\") on node \"crc\" DevicePath \"\""
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.431212 4839 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31974c6b-82d8-4d18-9dc2-9a7f29374d2f-utilities\") on node \"crc\" DevicePath \"\""
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.742037 4839 generic.go:334] "Generic (PLEG): container finished" podID="31974c6b-82d8-4d18-9dc2-9a7f29374d2f" containerID="8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571" exitCode=0
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.742093 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqgqm" event={"ID":"31974c6b-82d8-4d18-9dc2-9a7f29374d2f","Type":"ContainerDied","Data":"8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571"}
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.742172 4839 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nqgqm" event={"ID":"31974c6b-82d8-4d18-9dc2-9a7f29374d2f","Type":"ContainerDied","Data":"69a350c23f645fd571cfc04982724c5750af661fc4c28d11db24e0b6e55d97e9"}
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.742199 4839 scope.go:117] "RemoveContainer" containerID="8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.744953 4839 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nqgqm"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.763008 4839 scope.go:117] "RemoveContainer" containerID="b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.788719 4839 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nqgqm"]
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.796340 4839 scope.go:117] "RemoveContainer" containerID="2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.802556 4839 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nqgqm"]
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.844905 4839 scope.go:117] "RemoveContainer" containerID="8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571"
Mar 21 05:42:37 crc kubenswrapper[4839]: E0321 05:42:37.845315 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571\": container with ID starting with 8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571 not found: ID does not exist" containerID="8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.845358 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571"} err="failed to get container status \"8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571\": rpc error: code = NotFound desc = could not find container \"8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571\": container with ID starting with 8c7d3ddec3d41ff0172b9bc00312e6b00634bbc6afa0f48f2d02c9f8834b2571 not found: ID does not exist"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.845385 4839 scope.go:117] "RemoveContainer" containerID="b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00"
Mar 21 05:42:37 crc kubenswrapper[4839]: E0321 05:42:37.845841 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00\": container with ID starting with b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00 not found: ID does not exist" containerID="b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.845913 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00"} err="failed to get container status \"b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00\": rpc error: code = NotFound desc = could not find container \"b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00\": container with ID starting with b374efae9e2d442339b1b9f7f2d9a90aeba73c19a68818da8da62ebd72f5fe00 not found: ID does not exist"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.845947 4839 scope.go:117] "RemoveContainer" containerID="2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af"
Mar 21 05:42:37 crc kubenswrapper[4839]: E0321 05:42:37.846335 4839 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af\": container with ID starting with 2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af not found: ID does not exist" containerID="2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af"
Mar 21 05:42:37 crc kubenswrapper[4839]: I0321 05:42:37.846358 4839 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af"} err="failed to get container status \"2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af\": rpc error: code = NotFound desc = could not find container \"2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af\": container with ID starting with 2c45cc0f8befc65f525c075b18ed919ad6ee83a090fe3d2d9ed2b9b0293e76af not found: ID does not exist"
Mar 21 05:42:38 crc kubenswrapper[4839]: I0321 05:42:38.462395 4839 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31974c6b-82d8-4d18-9dc2-9a7f29374d2f" path="/var/lib/kubelet/pods/31974c6b-82d8-4d18-9dc2-9a7f29374d2f/volumes"
Mar 21 05:42:48 crc kubenswrapper[4839]: I0321 05:42:48.453029 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58"
Mar 21 05:42:48 crc kubenswrapper[4839]: E0321 05:42:48.453811 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"
Mar 21 05:42:58 crc kubenswrapper[4839]: I0321 05:42:58.984224 4839 scope.go:117] "RemoveContainer" containerID="3b72b1b0e5d05a1d6603f6bb93e0270d894bb25f1de761d8b7c5c8644a45fe83"
Mar 21 05:43:00 crc kubenswrapper[4839]: I0321 05:43:00.453313 4839 scope.go:117] "RemoveContainer" containerID="3ed2b1cd754eebd9a34fd22a5307aa90be9d9709c3e6c1624ea187562e9a5a58"
Mar 21 05:43:00 crc kubenswrapper[4839]: E0321 05:43:00.453877 4839 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jx4q7_openshift-machine-config-operator(4f92fefb-d5cd-451a-8bbe-31eea55d5bd9)\"" pod="openshift-machine-config-operator/machine-config-daemon-jx4q7" podUID="4f92fefb-d5cd-451a-8bbe-31eea55d5bd9"